diff --git a/bom/openhab-addons/pom.xml b/bom/openhab-addons/pom.xml
index ec063bbc94083..68e0ff4939941 100644
--- a/bom/openhab-addons/pom.xml
+++ b/bom/openhab-addons/pom.xml
@@ -16,6 +16,46 @@
 	<name>openHAB Add-ons :: BOM :: openHAB Add-ons</name>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.dynamodb</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.influxdb</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.jdbc</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.jpa</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.mapdb</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.mongodb</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.mysql</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+		<dependency>
+			<groupId>org.openhab.addons.bundles</groupId>
+			<artifactId>org.openhab.persistence.rrd4j</artifactId>
+			<version>${project.version}</version>
+		</dependency>
diff --git a/bundles/org.openhab.persistence.dynamodb/.classpath b/bundles/org.openhab.persistence.dynamodb/.classpath
new file mode 100644
index 0000000000000..19368e503c1f5
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/.classpath
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.dynamodb/.project b/bundles/org.openhab.persistence.dynamodb/.project
new file mode 100644
index 0000000000000..af66634e644ab
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/.project
@@ -0,0 +1,23 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>org.openhab.persistence.dynamodb</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.eclipse.jdt.core.javabuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.m2e.core.maven2Builder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.eclipse.jdt.core.javanature</nature>
+		<nature>org.eclipse.m2e.core.maven2Nature</nature>
+	</natures>
+</projectDescription>
diff --git a/bundles/org.openhab.persistence.dynamodb/ESH-INF/config/config.xml b/bundles/org.openhab.persistence.dynamodb/ESH-INF/config/config.xml
new file mode 100644
index 0000000000000..6abf96b5ee850
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/ESH-INF/config/config.xml
@@ -0,0 +1,117 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+	xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0 https://openhab.org/schemas/config-description-1.0.0.xsd">
+
+	<config-description uri="persistence:dynamodb">
+
+		<parameter name="region" type="text" required="true">
+			<label>AWS region ID</label>
+			<description><![CDATA[
+				The region needs to match the region of the AWS user that will access Amazon DynamoDB.
+				For example, eu-west-1.]]></description>
+		</parameter>
+
+		<parameter name="accessKey" type="text">
+			<label>AWS access key</label>
+			<description><![CDATA[
+				Give either 1) access key and secret key, or 2) credentials file and profile name.
+				]]></description>
+		</parameter>
+
+		<parameter name="secretKey" type="text">
+			<label>AWS secret key</label>
+			<description><![CDATA[
+				Give either 1) access key and secret key, or 2) credentials file and profile name.
+				]]></description>
+		</parameter>
+
+		<parameter name="profilesConfigFile" type="text">
+			<label>AWS credentials file</label>
+			<description><![CDATA[
+				For example, /etc/openhab2/aws_creds.
+				Please note that the user that runs openHAB must have appropriate read rights to the credentials file.
+
+				Give either 1) access key and secret key, or 2) credentials file and profile name.
+				]]></description>
+		</parameter>
+
+		<parameter name="profile" type="text">
+			<label>Profile name</label>
+			<description><![CDATA[
+				Give either 1) access key and secret key, or 2) credentials file and profile name.
+				]]></description>
+		</parameter>
+
+		<parameter name="readCapacityUnits" type="integer">
+			<label>Read capacity</label>
+			<description>Read capacity for the created tables. Default is 1.</description>
+			<advanced>true</advanced>
+		</parameter>
+
+		<parameter name="writeCapacityUnits" type="integer">
+			<label>Write capacity</label>
+			<description>Write capacity for the created tables. Default is 1.</description>
+			<advanced>true</advanced>
+		</parameter>
+
+		<parameter name="tablePrefix" type="text">
+			<label>Table prefix</label>
+			<description>Table prefix used in the name of created tables. Default is openhab-</description>
+			<advanced>true</advanced>
+		</parameter>
+
+	</config-description>
+</config-description:config-descriptions>
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.dynamodb/NOTICE b/bundles/org.openhab.persistence.dynamodb/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
+
diff --git a/bundles/org.openhab.persistence.dynamodb/README.md b/bundles/org.openhab.persistence.dynamodb/README.md
new file mode 100644
index 0000000000000..9b3b815e91bcf
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/README.md
@@ -0,0 +1,161 @@
+# Amazon DynamoDB Persistence
+
+This service allows you to persist state updates using the [Amazon DynamoDB](https://aws.amazon.com/dynamodb/) database. Query functionality is also fully supported.
+
+Features:
+
+* Writing/reading item states to/from the Amazon DynamoDB NoSQL database
+* Configurable database table names
+* Automatic table creation
+
+## Disclaimer
+
+This service is provided "AS IS", and the user takes full responsibility for any charges incurred or for any damage to data stored in Amazon DynamoDB.
+
+## Table of Contents
+
+
+
+
+- [Prerequisites](#prerequisites)
+ - [Setting Up an Amazon Account](#setting-up-an-amazon-account)
+- [Configuration](#configuration)
+ - [Basic configuration](#basic-configuration)
+ - [Configuration Using Credentials File](#configuration-using-credentials-file)
+ - [Advanced Configuration](#advanced-configuration)
+- [Details](#details)
+ - [Tables Creation](#tables-creation)
+ - [Caveats](#caveats)
+- [Developer Notes](#developer-notes)
+ - [Updating Amazon SDK](#updating-amazon-sdk)
+
+
+
+## Prerequisites
+
+You must first set up an Amazon account as described below.
+
+Users are recommended to familiarize themselves with AWS pricing before using this service. Please note that there might be charges from Amazon when using this service to query/store data to DynamoDB. See [Amazon DynamoDB pricing pages](https://aws.amazon.com/dynamodb/pricing/) for more details. Please also note possible [Free Tier](https://aws.amazon.com/free/) benefits.
+
+### Setting Up an Amazon Account
+
+* [Sign up](https://aws.amazon.com/) for Amazon AWS.
+* Select the AWS region in the [AWS console](https://console.aws.amazon.com/) using [these instructions](https://docs.aws.amazon.com/awsconsolehelpdocs/latest/gsg/getting-started.html#select-region). Note the region identifier in the URL (e.g. `https://eu-west-1.console.aws.amazon.com/console/home?region=eu-west-1` means that region id is `eu-west-1`).
+* **Create user for openHAB with IAM**
+  * Open Services -> IAM -> Users -> Create new Users. Enter `openhab` in _User names_, keep _Generate an access key for each user_ checked, and finally click _Create_.
+ * _Show User Security Credentials_ and record the keys displayed
+* **Configure user policy to have access for dynamodb**
+ * Open Services -> IAM -> Policies
+ * Check _AmazonDynamoDBFullAccess_ and click _Policy actions_ -> _Attach_
+  * Check the user created above and click _Attach policy_
+
+## Configuration
+
+This service can be configured in the file `services/dynamodb.cfg`.
+
+### Basic configuration
+
+| Property | Default | Required | Description |
+| --------- | ------- | :------: | ------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| accessKey | | Yes | access key as shown in [Setting up Amazon account](#setting-up-an-amazon-account). |
+| secretKey | | Yes | secret key as shown in [Setting up Amazon account](#setting-up-an-amazon-account). |
+| region | | Yes | AWS region ID as described in [Setting up Amazon account](#setting-up-an-amazon-account). The region needs to match the region that was used to create the user. |
+
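+For example, a minimal `services/dynamodb.cfg` using access keys could look like this (the key values below are placeholders, not working credentials):
+
+```ini
+accessKey=testAccessKey
+secretKey=testSecretKey
+region=eu-west-1
+```
+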
+### Configuration Using Credentials File
+
+Alternatively, instead of specifying `accessKey` and `secretKey`, you can use an AWS credentials file with a named profile.
+
+| Property | Default | Required | Description |
+| ------------------ | ------- | :------: | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| profilesConfigFile |         | Yes      | path to the credentials file. For example, `/etc/openhab2/aws_creds`. Please note that the user that runs openHAB must have appropriate read rights to the credentials file. For more details on the Amazon credentials file format, see [Amazon documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html). |
+| profile | | Yes | name of the profile to use |
+| region             |         | Yes      | AWS region ID as described in [Setting up Amazon account](#setting-up-an-amazon-account). The region needs to match the region that was used to create the user. |
+
+Example of service configuration file (`services/dynamodb.cfg`):
+
+```ini
+profilesConfigFile=/etc/openhab2/aws_creds
+profile=fooprofile
+region=eu-west-1
+```
+
+Example of credentials file (`/etc/openhab2/aws_creds`):
+
+```ini
+[fooprofile]
+aws_access_key_id=testAccessKey
+aws_secret_access_key=testSecretKey
+```
+
+### Advanced Configuration
+
+In addition to the configuration properties above, the following are also available:
+
+| Property | Default | Required | Description |
+| -------------------------- | ---------- | :------: | -------------------------------------------------------------------------------------------------- |
+| readCapacityUnits | 1 | No | read capacity for the created tables |
+| writeCapacityUnits | 1 | No | write capacity for the created tables |
+| tablePrefix | `openhab-` | No | table prefix used in the name of created tables |
+| bufferCommitIntervalMillis | 1000 | No | Interval to commit (write) buffered data. In milliseconds. |
+| bufferSize | 1000 | No | Internal buffer size in datapoints which is used to batch writes to DynamoDB every `bufferCommitIntervalMillis`. |
+
+Typically you should not need to modify parameters related to buffering.
+
+Refer to Amazon documentation on [provisioned throughput](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ProvisionedThroughput.html) for details on read/write capacity.
+
+All item- and event-related configuration is done in the file `persistence/dynamodb.persist`.
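+
+As an illustration, a simple `persistence/dynamodb.persist` could look like the following (the item names are made up for the example):
+
+```
+Strategies {
+    default = everyChange
+}
+
+Items {
+    Temperature, Humidity : strategy = everyChange
+}
+```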
+
+## Details
+
+### Tables Creation
+
+When an item is persisted via this service, a table is created (if necessary). Currently, the service will create at most two tables for different item types. The tables will be named `<tablePrefix><item-type>`, where `<item-type>` is either `bigdecimal` (numeric items) or `string` (string and complex items).
+
+Each table will have three columns: `itemname` (item name), `timeutc` (in ISO 8601 format with millisecond accuracy), and `itemstate` (either a number or string representing item state).
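+
+As an illustration, a numeric item persisted with the default table prefix would end up in the `openhab-bigdecimal` table, with rows roughly like the following (the values are made up for the example):
+
+```
+itemname     timeutc                   itemstate
+Temperature  2020-01-01T12:00:00.000Z  21.5
+Temperature  2020-01-01T12:05:00.000Z  21.7
+```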
+
+### Buffering
+
+By default, the service is asynchronous which means that data is not written immediately to DynamoDB but instead buffered in-memory.
+The size of the buffer, in terms of datapoints, can be configured with `bufferSize`.
+Every `bufferCommitIntervalMillis` the whole buffer of data is flushed to DynamoDB.
+
+It is recommended to keep buffering enabled, since the synchronous behaviour (writing data immediately) might have an adverse impact on the whole system when many items are persisted at the same time. Buffering can be disabled by setting `bufferSize` to zero.
+
+The defaults should be suitable in many use cases.
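+
+For example, buffering can be turned off entirely in `services/dynamodb.cfg`, making every store operation a synchronous write:
+
+```ini
+bufferSize=0
+```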
+
+### Caveats
+
+When the tables are created, the read/write capacity is configured according to configuration. However, the service does not modify the capacity of existing tables. As a workaround, you can modify the read/write capacity of existing tables using the [Amazon console](https://aws.amazon.com/console/).
+
+## Developer Notes
+
+### Updating Amazon SDK
+
+1. Clean `lib/*`
+2. Update SDK version in `scripts/fetch_sdk_pom.xml`. You can use the [maven online repository browser](https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-dynamodb) to find the latest version available online.
+3. `scripts/fetch_sdk.sh`
+4. Copy `scripts/target/site/dependencies.html` and `scripts/target/dependency/*.jar` to `lib/`
+5. Generate `build.properties` entries
+`ls lib/*.jar | python -c "import sys; print(' ' + ',\\\\\\n '.join(map(str.strip, sys.stdin.readlines())))"`
+6. Generate `META-INF/MANIFEST.MF` `Bundle-ClassPath` entries
+`ls lib/*.jar | python -c "import sys; print(' ' + ',\\n '.join(map(str.strip, sys.stdin.readlines())))"`
+7. Generate `.classpath` entries
+`ls lib/*.jar | python -c "import sys; pre='<classpathentry exported=\"true\" kind=\"lib\" path=\"'; post='\"/>'; print('\\t' + pre + (post + '\\n\\t' + pre).join(map(str.strip, sys.stdin.readlines())) + post)"`
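+
+For example, assuming `lib/` contains the SDK jars, step 7 should produce `.classpath` library entries along these lines (the jar names are illustrative):
+
+```xml
+<classpathentry exported="true" kind="lib" path="lib/aws-java-sdk-core-1.11.213.jar"/>
+<classpathentry exported="true" kind="lib" path="lib/aws-java-sdk-dynamodb-1.11.213.jar"/>
+```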
+
+After these changes, it is good practice to run the integration tests (against live AWS DynamoDB) in the `org.openhab.persistence.dynamodb.test` bundle. See the README.md in the test bundle for more information on how to execute the tests.
+
+### Running integration tests
+
+To run integration tests, one needs to provide AWS credentials.
+
+Eclipse instructions:
+
+1. Run all tests (in the `org.openhab.persistence.dynamodb.internal` package) as JUnit tests
+2. Open the run configuration and select the Arguments tab
+3. In VM arguments, provide the credentials for AWS:
+```
+-DDYNAMODBTEST_REGION=REGION-ID
+-DDYNAMODBTEST_ACCESS=ACCESS-KEY
+-DDYNAMODBTEST_SECRET=SECRET
+```
+
+The tests will create tables with the prefix `dynamodb-integration-tests-`. Note that when the tests begin, all data is removed from those tables!
diff --git a/bundles/org.openhab.persistence.dynamodb/pom.xml b/bundles/org.openhab.persistence.dynamodb/pom.xml
new file mode 100644
index 0000000000000..b74d7b0dfa2de
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/pom.xml
@@ -0,0 +1,114 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+	<modelVersion>4.0.0</modelVersion>
+
+	<parent>
+		<groupId>org.openhab.addons.bundles</groupId>
+		<artifactId>org.openhab.addons.reactor.bundles</artifactId>
+		<version>3.0.0-SNAPSHOT</version>
+	</parent>
+
+	<artifactId>org.openhab.persistence.dynamodb</artifactId>
+
+	<name>openHAB Add-ons :: Bundles :: Persistence Service :: DynamoDB</name>
+
+	<properties>
+		<bnd.importpackage>!com.amazonaws.*,!org.joda.convert.*,!com.sun.org.apache.xpath.*,!kotlin,!org.apache.log.*,!org.bouncycastle.*,!org.apache.avalon.*</bnd.importpackage>
+	</properties>
+
+	<dependencies>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-core</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-dynamodb</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-kms</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-s3</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>jmespath-java</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpclient</artifactId>
+			<version>4.5.2</version>
+		</dependency>
+		<dependency>
+			<groupId>software.amazon.ion</groupId>
+			<artifactId>ion-java</artifactId>
+			<version>1.0.2</version>
+		</dependency>
+		<dependency>
+			<groupId>org.apache.httpcomponents</groupId>
+			<artifactId>httpcore</artifactId>
+			<version>4.4.4</version>
+		</dependency>
+		<dependency>
+			<groupId>commons-logging</groupId>
+			<artifactId>commons-logging</artifactId>
+			<version>1.1.3</version>
+		</dependency>
+		<dependency>
+			<groupId>commons-codec</groupId>
+			<artifactId>commons-codec</artifactId>
+			<version>1.9</version>
+		</dependency>
+		<dependency>
+			<groupId>joda-time</groupId>
+			<artifactId>joda-time</artifactId>
+			<version>2.8.1</version>
+		</dependency>
+		<dependency>
+			<groupId>com.fasterxml.jackson.core</groupId>
+			<artifactId>jackson-annotations</artifactId>
+			<version>2.6.0</version>
+		</dependency>
+		<dependency>
+			<groupId>com.fasterxml.jackson.core</groupId>
+			<artifactId>jackson-core</artifactId>
+			<version>2.6.7</version>
+		</dependency>
+		<dependency>
+			<groupId>com.fasterxml.jackson.core</groupId>
+			<artifactId>jackson-databind</artifactId>
+			<version>2.6.7.1</version>
+		</dependency>
+		<dependency>
+			<groupId>com.fasterxml.jackson.dataformat</groupId>
+			<artifactId>jackson-dataformat-cbor</artifactId>
+			<version>2.6.7</version>
+		</dependency>
+	</dependencies>
+</project>
diff --git a/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk.sh b/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk.sh
new file mode 100755
index 0000000000000..9200e6f0eabe7
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk.sh
@@ -0,0 +1,5 @@
+#!/usr/bin/env bash
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+mvn -f "$DIR/fetch_sdk_pom.xml" clean process-sources project-info-reports:dependencies
+
+echo "Check $DIR/target/site/dependencies.html and $DIR/target/dependency"
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk_pom.xml b/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk_pom.xml
new file mode 100644
index 0000000000000..203af387d8c2d
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/scripts/fetch_sdk_pom.xml
@@ -0,0 +1,37 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
+	<modelVersion>4.0.0</modelVersion>
+	<groupId>groupId</groupId>
+	<artifactId>artifactId</artifactId>
+	<version>1.0</version>
+
+	<dependencies>
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-java-sdk-dynamodb</artifactId>
+			<version>1.11.213</version>
+		</dependency>
+	</dependencies>
+
+	<build>
+		<plugins>
+			<plugin>
+				<artifactId>maven-dependency-plugin</artifactId>
+				<executions>
+					<execution>
+						<phase>process-sources</phase>
+						<goals>
+							<goal>copy-dependencies</goal>
+						</goals>
+						<configuration>
+							<outputDirectory>${targetdirectory}</outputDirectory>
+						</configuration>
+					</execution>
+				</executions>
+			</plugin>
+		</plugins>
+	</build>
+</project>
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/feature/feature.xml b/bundles/org.openhab.persistence.dynamodb/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..675ae135e72b9
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/feature/feature.xml
@@ -0,0 +1,11 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.dynamodb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+	<repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+	<feature name="openhab-persistence-dynamodb" description="DynamoDB Persistence" version="${project.version}">
+		<feature>openhab-runtime-base</feature>
+		<bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.dynamodb/${project.version}</bundle>
+		<config-file finalname="${openhab.conf}/services/dynamodb.cfg">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/dynamodb</config-file>
+	</feature>
+</features>
+
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractBufferedPersistenceService.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractBufferedPersistenceService.java
new file mode 100644
index 0000000000000..4f04277ee8784
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractBufferedPersistenceService.java
@@ -0,0 +1,127 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+import java.util.UUID;
+import java.util.concurrent.ArrayBlockingQueue;
+import java.util.concurrent.BlockingQueue;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Abstract class for buffered persistence services
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ *
+ */
+@NonNullByDefault
+public abstract class AbstractBufferedPersistenceService<T> implements PersistenceService {
+
+ private static final long BUFFER_OFFER_TIMEOUT_MILLIS = 500;
+
+ private final Logger logger = LoggerFactory.getLogger(AbstractBufferedPersistenceService.class);
+    protected @Nullable BlockingQueue<T> buffer;
+
+ private boolean writeImmediately;
+
+ protected void resetWithBufferSize(int bufferSize) {
+ int capacity = Math.max(1, bufferSize);
+        buffer = new ArrayBlockingQueue<>(capacity, true);
+ writeImmediately = bufferSize == 0;
+ }
+
+ protected abstract T persistenceItemFromState(String name, State state, Date time);
+
+ protected abstract boolean isReadyToStore();
+
+ protected abstract void flushBufferedData();
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ long storeStart = System.currentTimeMillis();
+ String uuid = UUID.randomUUID().toString();
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("Undefined item state received. Not storing item {}.", item.getName());
+ return;
+ }
+ if (!isReadyToStore()) {
+ return;
+ }
+ if (buffer == null) {
+ throw new IllegalStateException("Buffer not initialized with resetWithBufferSize. Bug?");
+ }
+ Date time = new Date(storeStart);
+ String realName = item.getName();
+ String name = (alias != null) ? alias : realName;
+ State state = item.getState();
+ T persistenceItem = persistenceItemFromState(name, state, time);
+ logger.trace("store() called with item {}, which was converted to {} [{}]", item, persistenceItem, uuid);
+ if (writeImmediately) {
+ logger.debug("Writing immediately item {} [{}]", realName, uuid);
+ // We want to write everything immediately
+ // Synchronous behavior to ensure buffer does not get full.
+ synchronized (this) {
+ boolean buffered = addToBuffer(persistenceItem);
+ assert buffered;
+ flushBufferedData();
+ }
+ } else {
+ long bufferStart = System.currentTimeMillis();
+ boolean buffered = addToBuffer(persistenceItem);
+ if (buffered) {
+ logger.debug("Buffered item {} in {} ms. Total time for store(): {} [{}]", realName,
+ System.currentTimeMillis() - bufferStart, System.currentTimeMillis() - storeStart, uuid);
+ } else {
+ logger.debug(
+ "Buffer is full. Writing buffered data immediately and trying again. Consider increasing bufferSize");
+ // Buffer is full, commit it immediately
+ flushBufferedData();
+ boolean buffered2 = addToBuffer(persistenceItem);
+ if (buffered2) {
+ logger.debug("Buffered item in {} ms (2nd try, flushed buffer in-between) [{}]",
+ System.currentTimeMillis() - bufferStart, uuid);
+ } else {
+ // The unlikely case happened -- buffer got full again immediately
+ logger.warn("Buffering failed for the second time -- Too small bufferSize? Discarding data [{}]",
+ uuid);
+ }
+ }
+ }
+ }
+
+ protected boolean addToBuffer(T persistenceItem) {
+ try {
+ return buffer != null && buffer.offer(persistenceItem, BUFFER_OFFER_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS);
+ } catch (InterruptedException e) {
+ logger.warn("Interrupted when trying to buffer data! Dropping data");
+ return false;
+ }
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItem.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItem.java
new file mode 100644
index 0000000000000..cfd15a0b1940b
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItem.java
@@ -0,0 +1,211 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.text.DateFormat;
+import java.text.ParseException;
+import java.text.SimpleDateFormat;
+import java.util.Calendar;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.TimeZone;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.types.UpDownType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Base class for all DynamoDBItem. Represents openHAB Item serialized in a suitable format for the database
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
+
+ public static final SimpleDateFormat DATEFORMATTER = new SimpleDateFormat(DATE_FORMAT);
+
+ static {
+ DATEFORMATTER.setTimeZone(TimeZone.getTimeZone("UTC"));
+ }
+
+ private static final String UNDEFINED_PLACEHOLDER = "";
+
+    static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> itemClassToDynamoItemClass = new HashMap<>();
+
+ static {
+ itemClassToDynamoItemClass.put(CallItem.class, DynamoDBStringItem.class);
+ itemClassToDynamoItemClass.put(ContactItem.class, DynamoDBBigDecimalItem.class);
+ itemClassToDynamoItemClass.put(DateTimeItem.class, DynamoDBStringItem.class);
+ itemClassToDynamoItemClass.put(LocationItem.class, DynamoDBStringItem.class);
+ itemClassToDynamoItemClass.put(NumberItem.class, DynamoDBBigDecimalItem.class);
+ itemClassToDynamoItemClass.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
+ itemClassToDynamoItemClass.put(StringItem.class, DynamoDBStringItem.class);
+ itemClassToDynamoItemClass.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
+ itemClassToDynamoItemClass.put(DimmerItem.class, DynamoDBBigDecimalItem.class); // inherited from SwitchItem (!)
+ itemClassToDynamoItemClass.put(ColorItem.class, DynamoDBStringItem.class); // inherited from DimmerItem
+ }
+
+    public static final Class<DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass)
+            throws NullPointerException {
+        @SuppressWarnings("unchecked")
+        Class<DynamoDBItem<?>> dtoclass = (Class<DynamoDBItem<?>>) itemClassToDynamoItemClass.get(itemClass);
+ if (dtoclass == null) {
+ throw new IllegalArgumentException(String.format("Unknown item class %s", itemClass));
+ }
+ return dtoclass;
+ }
+
+ private final Logger logger = LoggerFactory.getLogger(AbstractDynamoDBItem.class);
+
+ protected String name;
+ protected T state;
+ protected Date time;
+
+ public AbstractDynamoDBItem(String name, T state, Date time) {
+ this.name = name;
+ this.state = state;
+ this.time = time;
+ }
+
+    public static DynamoDBItem<?> fromState(String name, State state, Date time) {
+ if (state instanceof DecimalType && !(state instanceof HSBType)) {
+ // also covers PercentType which is inherited from DecimalType
+ return new DynamoDBBigDecimalItem(name, ((DecimalType) state).toBigDecimal(), time);
+ } else if (state instanceof OnOffType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((OnOffType) state) == OnOffType.ON ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof OpenClosedType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((OpenClosedType) state) == OpenClosedType.OPEN ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof UpDownType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((UpDownType) state) == UpDownType.UP ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof DateTimeType) {
+ return new DynamoDBStringItem(name, DATEFORMATTER.format(((DateTimeType) state).getCalendar().getTime()),
+ time);
+ } else if (state instanceof UnDefType) {
+ return new DynamoDBStringItem(name, UNDEFINED_PLACEHOLDER, time);
+        } else if (state instanceof StringListType) {
+            // toFullString() returns the comma-separated format expected by the
+            // StringListType constructor
+            return new DynamoDBStringItem(name, state.toFullString(), time);
+ } else {
+ // HSBType, PointType and StringType
+ return new DynamoDBStringItem(name, state.toFullString(), time);
+ }
+ }
+
+ @Override
+ public HistoricItem asHistoricItem(final Item item) {
+ final State[] state = new State[1];
+ accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ if (item instanceof ColorItem) {
+ state[0] = new HSBType(dynamoStringItem.getState());
+ } else if (item instanceof LocationItem) {
+ state[0] = new PointType(dynamoStringItem.getState());
+                } else if (item instanceof DateTimeItem) {
+                    Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+                    try {
+                        cal.setTime(DATEFORMATTER.parse(dynamoStringItem.getState()));
+                        state[0] = new DateTimeType(cal);
+                    } catch (ParseException e) {
+                        logger.warn("Failed to parse {} as date. Outputting UNDEF instead",
+                                dynamoStringItem.getState());
+                        state[0] = UnDefType.UNDEF;
+                    }
+ } else if (dynamoStringItem.getState().equals(UNDEFINED_PLACEHOLDER)) {
+ state[0] = UnDefType.UNDEF;
+ } else if (item instanceof CallItem) {
+ String parts = dynamoStringItem.getState();
+ String[] strings = parts.split(",");
+ String orig = strings[0];
+ String dest = strings[1];
+ state[0] = new StringListType(orig, dest);
+ } else {
+ state[0] = new StringType(dynamoStringItem.getState());
+ }
+ }
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ if (item instanceof NumberItem) {
+ state[0] = new DecimalType(dynamoBigDecimalItem.getState());
+ } else if (item instanceof DimmerItem) {
+ state[0] = new PercentType(dynamoBigDecimalItem.getState());
+ } else if (item instanceof SwitchItem) {
+ state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OnOffType.ON
+ : OnOffType.OFF;
+ } else if (item instanceof ContactItem) {
+ state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OpenClosedType.OPEN
+ : OpenClosedType.CLOSED;
+ } else if (item instanceof RollershutterItem) {
+ state[0] = new PercentType(dynamoBigDecimalItem.getState());
+ } else {
+ logger.warn("Not sure how to convert big decimal item {} to type {}. Using StringType as fallback",
+ dynamoBigDecimalItem.getName(), item.getClass());
+ state[0] = new StringType(dynamoBigDecimalItem.getState().toString());
+ }
+ }
+ });
+ return new DynamoDBHistoricItem(getName(), state[0], getTime());
+ }
+
+    /**
+     * All getters and setters are defined in the child classes, which implement them.
+     * Having the getter and setter implementations here in the parent class does not
+     * work with the introspection done by the AWS SDK (1.11.56).
+     */
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see org.openhab.persistence.dynamodb.internal.DynamoItem#accept(org.openhab.persistence.dynamodb.internal.
+ * DynamoItemVisitor)
+ */
+ @Override
+ public abstract void accept(DynamoDBItemVisitor visitor);
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(time) + ": " + name + " -> " + state.toString();
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBBigDecimalItem.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBBigDecimalItem.java
new file mode 100644
index 0000000000000..71ba808afaac6
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBBigDecimalItem.java
@@ -0,0 +1,95 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.math.MathContext;
+import java.util.Date;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBDocument;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
+
+/**
+ * DynamoDBItem for items that can be serialized as DynamoDB number
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@DynamoDBDocument
+public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
+
+ /**
+ * We get the following error if the BigDecimal has too many digits
+ * "Attempting to store more than 38 significant digits in a Number"
+ *
+ * See "Data types" section in
+ * http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Limits.html
+ */
+ private static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;
+
+ public DynamoDBBigDecimalItem() {
+ this(null, null, null);
+ }
+
+ public DynamoDBBigDecimalItem(String name, BigDecimal state, Date time) {
+ super(name, state, time);
+ }
+
+ @DynamoDBAttribute(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE)
+ @Override
+ public BigDecimal getState() {
+ // When serializing this to the wire, we round the number in order to ensure
+ // that it is within the dynamodb limits
+ return loseDigits(state);
+ }
+
+ @DynamoDBHashKey(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ @DynamoDBRangeKey(attributeName = ATTRIBUTE_NAME_TIMEUTC)
+ public Date getTime() {
+ return time;
+ }
+
+ @Override
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public void setState(BigDecimal state) {
+ this.state = state;
+ }
+
+ @Override
+ public void setTime(Date time) {
+ this.time = time;
+ }
+
+ @Override
+ public void accept(org.openhab.persistence.dynamodb.internal.DynamoDBItemVisitor visitor) {
+ visitor.visit(this);
+ }
+
+ static BigDecimal loseDigits(BigDecimal number) {
+ if (number == null) {
+ return null;
+ }
+ return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
+ }
+}
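DynamoDB rejects numbers carrying more than 38 significant digits, which is why `loseDigits` rounds before serialization. The rounding can be reproduced standalone with only `java.math` (the class name here is illustrative, not part of the add-on):

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class LoseDigitsDemo {
    // DynamoDB's documented precision limit for the Number type
    static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;

    // Same logic as DynamoDBBigDecimalItem.loseDigits: round to 38 significant digits
    static BigDecimal loseDigits(BigDecimal number) {
        if (number == null) {
            return null;
        }
        return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
    }

    public static void main(String[] args) {
        // 40 significant digits: DynamoDB would reject this value as-is
        BigDecimal tooPrecise = new BigDecimal("1.234567890123456789012345678901234567891");
        System.out.println(tooPrecise.precision());              // prints 40
        System.out.println(loseDigits(tooPrecise).precision());  // prints 38
    }
}
```

Note that `MathContext` rounding trims significant digits, not decimal places, so very large integers are rounded the same way as long fractions.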
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBClient.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBClient.java
new file mode 100644
index 0000000000000..2f1daa14c1475
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBClient.java
@@ -0,0 +1,66 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSStaticCredentialsProvider;
+import com.amazonaws.regions.Regions;
+import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
+import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
+import com.amazonaws.services.dynamodbv2.document.DynamoDB;
+
+/**
+ * Shallow wrapper around the AWS DynamoDB client classes
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public class DynamoDBClient {
+ private static final Logger logger = LoggerFactory.getLogger(DynamoDBClient.class);
+ private DynamoDB dynamo;
+ private AmazonDynamoDB client;
+
+ public DynamoDBClient(AWSCredentials credentials, Regions region) {
+ client = AmazonDynamoDBClientBuilder.standard().withRegion(region)
+ .withCredentials(new AWSStaticCredentialsProvider(credentials)).build();
+ dynamo = new DynamoDB(client);
+ }
+
+ public DynamoDBClient(DynamoDBConfig clientConfig) {
+ this(clientConfig.getCredentials(), clientConfig.getRegion());
+ }
+
+ public AmazonDynamoDB getDynamoClient() {
+ return client;
+ }
+
+ public DynamoDB getDynamoDB() {
+ return dynamo;
+ }
+
+ public void shutdown() {
+ dynamo.shutdown();
+ }
+
+ public boolean checkConnection() {
+ try {
+ dynamo.listTables(1).firstPage();
+ } catch (Exception e) {
+ logger.warn("Got internal server error when trying to list tables: {}", e.getMessage());
+ return false;
+ }
+ return true;
+ }
+}
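`checkConnection` turns any client failure into a boolean by issuing the cheapest possible read (listing a single table). The pattern can be sketched without the AWS SDK; the `TableLister` interface below is a hypothetical stand-in for the `DynamoDB` client:

```java
public class ConnectionProbe {
    // Hypothetical stand-in for the AWS client's listTables(limit) call
    @FunctionalInterface
    interface TableLister {
        void listTables(int limit);
    }

    // Mirrors DynamoDBClient.checkConnection: issue a cheap read, map any failure to false
    static boolean checkConnection(TableLister client) {
        try {
            client.listTables(1);
        } catch (Exception e) {
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(checkConnection(limit -> {}));  // prints true
        System.out.println(checkConnection(limit -> {
            throw new RuntimeException("credentials missing");
        }));  // prints false
    }
}
```

Catching the broad `Exception` is deliberate here: a health probe should report "unavailable" rather than propagate SDK-specific exception types to its callers.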
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfig.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfig.java
new file mode 100644
index 0000000000000..04b412799bd81
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfig.java
@@ -0,0 +1,207 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.apache.commons.lang.StringUtils.isBlank;
+
+import java.util.Map;
+
+import org.apache.commons.lang.StringUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.BasicAWSCredentials;
+import com.amazonaws.auth.profile.ProfilesConfigFile;
+import com.amazonaws.regions.Regions;
+
+/**
+ * Configuration for DynamoDB connections
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public class DynamoDBConfig {
+ public static final String DEFAULT_TABLE_PREFIX = "openhab-";
+ public static final boolean DEFAULT_CREATE_TABLE_ON_DEMAND = true;
+ public static final long DEFAULT_READ_CAPACITY_UNITS = 1;
+ public static final long DEFAULT_WRITE_CAPACITY_UNITS = 1;
+ public static final long DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS = 1000;
+ public static final int DEFAULT_BUFFER_SIZE = 1000;
+
+ private static final Logger logger = LoggerFactory.getLogger(DynamoDBConfig.class);
+
+ private String tablePrefix = DEFAULT_TABLE_PREFIX;
+ private Regions region;
+ private AWSCredentials credentials;
+ private boolean createTable = DEFAULT_CREATE_TABLE_ON_DEMAND;
+ private long readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
+ private long writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
+ private long bufferCommitIntervalMillis = DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS;
+ private int bufferSize = DEFAULT_BUFFER_SIZE;
+
+ /**
+ *
+ * @param config persistence service configuration
+ * @return DynamoDB configuration. Returns null in case of configuration errors
+ */
+ public static DynamoDBConfig fromConfig(Map<String, Object> config) {
+ if (config == null || config.isEmpty()) {
+ logger.error("Configuration not provided! At least AWS region and credentials must be provided.");
+ return null;
+ }
+
+ try {
+ String regionName = (String) config.get("region");
+ if (isBlank(regionName)) {
+ invalidRegionLogHelp(regionName);
+ return null;
+ }
+ final Regions region;
+ try {
+ region = Regions.fromName(regionName);
+ } catch (IllegalArgumentException e) {
+ invalidRegionLogHelp(regionName);
+ return null;
+ }
+
+ AWSCredentials credentials;
+ String accessKey = (String) config.get("accessKey");
+ String secretKey = (String) config.get("secretKey");
+ if (!isBlank(accessKey) && !isBlank(secretKey)) {
+ logger.debug("accessKey and secretKey specified. Using those.");
+ credentials = new BasicAWSCredentials(accessKey, secretKey);
+ } else {
+ logger.debug("accessKey and/or secretKey blank. Checking profilesConfigFile and profile.");
+ String profilesConfigFile = (String) config.get("profilesConfigFile");
+ String profile = (String) config.get("profile");
+ if (isBlank(profilesConfigFile) || isBlank(profile)) {
+ logger.error("Specify either 1) accessKey and secretKey; or 2) profilesConfigFile and "
+ + "profile for providing AWS credentials");
+ return null;
+ }
+ credentials = new ProfilesConfigFile(profilesConfigFile).getCredentials(profile);
+ }
+
+ String table = (String) config.get("tablePrefix");
+ if (isBlank(table)) {
+ logger.debug("Using default table name {}", DEFAULT_TABLE_PREFIX);
+ table = DEFAULT_TABLE_PREFIX;
+ }
+
+ final boolean createTable;
+ String createTableParam = (String) config.get("createTable");
+ if (isBlank(createTableParam)) {
+ logger.debug("Creating table on demand: {}", DEFAULT_CREATE_TABLE_ON_DEMAND);
+ createTable = DEFAULT_CREATE_TABLE_ON_DEMAND;
+ } else {
+ createTable = Boolean.parseBoolean(createTableParam);
+ }
+
+ final long readCapacityUnits;
+ String readCapacityUnitsParam = (String) config.get("readCapacityUnits");
+ if (isBlank(readCapacityUnitsParam)) {
+ logger.debug("Read capacity units: {}", DEFAULT_READ_CAPACITY_UNITS);
+ readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
+ } else {
+ readCapacityUnits = Long.parseLong(readCapacityUnitsParam);
+ }
+
+ final long writeCapacityUnits;
+ String writeCapacityUnitsParam = (String) config.get("writeCapacityUnits");
+ if (isBlank(writeCapacityUnitsParam)) {
+ logger.debug("Write capacity units: {}", DEFAULT_WRITE_CAPACITY_UNITS);
+ writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
+ } else {
+ writeCapacityUnits = Long.parseLong(writeCapacityUnitsParam);
+ }
+
+ final long bufferCommitIntervalMillis;
+ String bufferCommitIntervalMillisParam = (String) config.get("bufferCommitIntervalMillis");
+ if (isBlank(bufferCommitIntervalMillisParam)) {
+ logger.debug("Buffer commit interval millis: {}", DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS);
+ bufferCommitIntervalMillis = DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS;
+ } else {
+ bufferCommitIntervalMillis = Long.parseLong(bufferCommitIntervalMillisParam);
+ }
+
+ final int bufferSize;
+ String bufferSizeParam = (String) config.get("bufferSize");
+ if (isBlank(bufferSizeParam)) {
+ logger.debug("Buffer size: {}", DEFAULT_BUFFER_SIZE);
+ bufferSize = DEFAULT_BUFFER_SIZE;
+ } else {
+ bufferSize = Integer.parseInt(bufferSizeParam);
+ }
+
+ return new DynamoDBConfig(region, credentials, table, createTable, readCapacityUnits, writeCapacityUnits,
+ bufferCommitIntervalMillis, bufferSize);
+ } catch (Exception e) {
+ logger.error("Error with configuration", e);
+ return null;
+ }
+ }
+
+ public DynamoDBConfig(Regions region, AWSCredentials credentials, String table, boolean createTable,
+ long readCapacityUnits, long writeCapacityUnits, long bufferCommitIntervalMillis, int bufferSize) {
+ this.region = region;
+ this.credentials = credentials;
+ this.tablePrefix = table;
+ this.createTable = createTable;
+ this.readCapacityUnits = readCapacityUnits;
+ this.writeCapacityUnits = writeCapacityUnits;
+ this.bufferCommitIntervalMillis = bufferCommitIntervalMillis;
+ this.bufferSize = bufferSize;
+ }
+
+ public AWSCredentials getCredentials() {
+ return credentials;
+ }
+
+ public String getTablePrefix() {
+ return tablePrefix;
+ }
+
+ public Regions getRegion() {
+ return region;
+ }
+
+ public boolean isCreateTable() {
+ return createTable;
+ }
+
+ public long getReadCapacityUnits() {
+ return readCapacityUnits;
+ }
+
+ public long getWriteCapacityUnits() {
+ return writeCapacityUnits;
+ }
+
+ public long getBufferCommitIntervalMillis() {
+ return bufferCommitIntervalMillis;
+ }
+
+ public int getBufferSize() {
+ return bufferSize;
+ }
+
+ private static void invalidRegionLogHelp(String region) {
+ Regions[] regions = Regions.values();
+ String[] regionNames = new String[regions.length];
+ for (int i = 0; i < regions.length; i++) {
+ regionNames[i] = regions[i].getName();
+ }
+ logger.error("Specify valid AWS region to use, got {}. Valid values include: {}", region,
+ StringUtils.join(regionNames, ','));
+ }
+}
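`fromConfig` repeats one pattern for every numeric option: read the raw string, fall back to the default when it is blank, parse it otherwise. That pattern, distilled into a helper (names are illustrative and not part of the add-on):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigDefaults {
    // Same blank test the add-on gets from commons-lang StringUtils.isBlank
    static boolean isBlank(String s) {
        return s == null || s.trim().isEmpty();
    }

    // Read config.get(key) as a long, using the default when the value is absent or blank
    static long longOrDefault(Map<String, Object> config, String key, long defaultValue) {
        String raw = (String) config.get(key);
        return isBlank(raw) ? defaultValue : Long.parseLong(raw);
    }

    public static void main(String[] args) {
        Map<String, Object> config = new HashMap<>();
        config.put("readCapacityUnits", "5");
        config.put("writeCapacityUnits", "   ");  // blank -> default applies

        System.out.println(longOrDefault(config, "readCapacityUnits", 1));             // prints 5
        System.out.println(longOrDefault(config, "writeCapacityUnits", 1));            // prints 1
        System.out.println(longOrDefault(config, "bufferCommitIntervalMillis", 1000)); // prints 1000
    }
}
```

As in the add-on itself, a non-blank but malformed value (e.g. `"abc"`) would raise `NumberFormatException` from `Long.parseLong`, which `fromConfig` catches in its outer try block and reports as a configuration error.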
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBHistoricItem.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBHistoricItem.java
new file mode 100644
index 0000000000000..4ae4558e427bd
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBHistoricItem.java
@@ -0,0 +1,58 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is a Java bean used to return historic items from DynamoDB.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class DynamoDBHistoricItem implements HistoricItem {
+ private final String name;
+ private final State state;
+ private final Date timestamp;
+
+ public DynamoDBHistoricItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItem.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItem.java
new file mode 100644
index 0000000000000..f81e8f41cde54
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItem.java
@@ -0,0 +1,59 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.HistoricItem;
+
+/**
+ * Represents openHAB Item serialized in a suitable format for the database
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public interface DynamoDBItem<T> {
+
+ static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+
+ static final String ATTRIBUTE_NAME_TIMEUTC = "timeutc";
+
+ static final String ATTRIBUTE_NAME_ITEMNAME = "itemname";
+
+ static final String ATTRIBUTE_NAME_ITEMSTATE = "itemstate";
+
+ /**
+ * Convert this AbstractDynamoDBItem to a HistoricItem.
+ *
+ * @param item Item representing this item. Used to determine item type.
+ * @return HistoricItem representing this DynamoDBItem.
+ */
+ HistoricItem asHistoricItem(Item item);
+
+ String getName();
+
+ T getState();
+
+ Date getTime();
+
+ void setName(String name);
+
+ void setState(T state);
+
+ void setTime(Date time);
+
+ void accept(DynamoDBItemVisitor visitor);
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItemVisitor.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItemVisitor.java
new file mode 100644
index 0000000000000..75909cfe54bfd
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBItemVisitor.java
@@ -0,0 +1,29 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Visitor for DynamoDBItem
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public interface DynamoDBItemVisitor {
+
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem);
+
+ public void visit(DynamoDBStringItem dynamoStringItem);
+}
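The visitor interface above gives the add-on double dispatch over its two DTO classes, e.g. for resolving the per-type table without `instanceof` chains. A self-contained sketch of the mechanism (all type names below are illustrative stand-ins, not the real DTO classes):

```java
public class VisitorDemo {
    interface DynamoItem {
        void accept(Visitor v);
    }

    // Analogous to DynamoDBItemVisitor: one overload per concrete DTO type
    interface Visitor {
        void visit(NumberDto item);
        void visit(StringDto item);
    }

    static class NumberDto implements DynamoItem {
        public void accept(Visitor v) { v.visit(this); }  // dispatch on concrete type
    }

    static class StringDto implements DynamoItem {
        public void accept(Visitor v) { v.visit(this); }
    }

    // e.g. choosing a table suffix per item type, as a table-name resolver might
    static String tableSuffix(DynamoItem item) {
        final String[] suffix = new String[1];
        item.accept(new Visitor() {
            public void visit(NumberDto i) { suffix[0] = "bigdecimal"; }
            public void visit(StringDto i) { suffix[0] = "string"; }
        });
        return suffix[0];
    }

    public static void main(String[] args) {
        System.out.println(tableSuffix(new NumberDto()));  // prints bigdecimal
        System.out.println(tableSuffix(new StringDto()));  // prints string
    }
}
```

The compiler enforces exhaustiveness: adding a third DTO class forces every visitor implementation to handle it, which plain `instanceof` checks would not.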
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBPersistenceService.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBPersistenceService.java
new file mode 100644
index 0000000000000..d794a22e11d1e
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBPersistenceService.java
@@ -0,0 +1,567 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.ArrayDeque;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Date;
+import java.util.Deque;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.Set;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.common.NamedThreadFactory;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.FailedBatch;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.PaginationLoadingStrategy;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBQueryExpression;
+import com.amazonaws.services.dynamodbv2.datamodeling.PaginatedQueryList;
+import com.amazonaws.services.dynamodbv2.document.BatchWriteItemOutcome;
+import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
+import com.amazonaws.services.dynamodbv2.model.GlobalSecondaryIndex;
+import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
+import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException;
+import com.amazonaws.services.dynamodbv2.model.TableDescription;
+import com.amazonaws.services.dynamodbv2.model.TableStatus;
+import com.amazonaws.services.dynamodbv2.model.WriteRequest;
+
+/**
+ * This is the implementation of the DynamoDB {@link PersistenceService}. It persists item values
+ * using the Amazon DynamoDB database. The states (
+ * {@link State}) of an {@link Item} are persisted in DynamoDB tables.
+ *
+ * The service creates tables automatically, one for numbers, and one for strings.
+ *
+ * @see AbstractDynamoDBItem.fromState for details on how different items are persisted
+ *
+ * @author Sami Salonen - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ *
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.dynamodb", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class DynamoDBPersistenceService extends AbstractBufferedPersistenceService<DynamoDBItem<?>>
+ implements QueryablePersistenceService {
+
+ private class ExponentialBackoffRetry implements Runnable {
+ private int retry;
+ private Map<String, List<WriteRequest>> unprocessedItems;
+ private @Nullable Exception lastException;
+
+ public ExponentialBackoffRetry(Map<String, List<WriteRequest>> unprocessedItems) {
+ this.unprocessedItems = unprocessedItems;
+ }
+
+ @Override
+ public void run() {
+ logger.debug("Error storing object to dynamo, unprocessed items: {}. Retrying with exponential back-off",
+ unprocessedItems);
+ lastException = null;
+ while (!unprocessedItems.isEmpty() && retry < WAIT_MILLIS_IN_RETRIES.length) {
+ if (!sleep()) {
+ // Interrupted
+ return;
+ }
+ retry++;
+ try {
+ BatchWriteItemOutcome outcome = DynamoDBPersistenceService.this.db.getDynamoDB()
+ .batchWriteItemUnprocessed(unprocessedItems);
+ unprocessedItems = outcome.getUnprocessedItems();
+ lastException = null;
+ } catch (AmazonServiceException e) {
+ if (e instanceof ResourceNotFoundException) {
+ logger.debug(
+ "DynamoDB query raised unexpected exception: {}. This might happen if table was recently created",
+ e.getMessage());
+ } else {
+ logger.debug("DynamoDB query raised unexpected exception: {}.", e.getMessage());
+ }
+ lastException = e;
+ continue;
+ }
+ }
+ if (unprocessedItems.isEmpty()) {
+ logger.debug("After {} retries successfully wrote all unprocessed items", retry);
+ } else {
+ logger.warn(
+ "Even after retries failed to write some items. Last exception: {} {}, unprocessed items: {}",
+ lastException == null ? "null" : lastException.getClass().getName(),
+ lastException == null ? "null" : lastException.getMessage(), unprocessedItems);
+ }
+ }
+
+ private boolean sleep() {
+ try {
+ long sleepTime;
+ if (retry == 1 && lastException != null && lastException instanceof ResourceNotFoundException) {
+ sleepTime = WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS;
+ } else {
+ sleepTime = WAIT_MILLIS_IN_RETRIES[retry];
+ }
+ Thread.sleep(sleepTime);
+ return true;
+ } catch (InterruptedException e) {
+ logger.debug("Interrupted while writing data!");
+ return false;
+ }
+ }
+
+ public Map<String, List<WriteRequest>> getUnprocessedItems() {
+ return unprocessedItems;
+ }
+ }
+
+ private static final int WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS = 5000;
+ private static final int[] WAIT_MILLIS_IN_RETRIES = new int[] { 100, 100, 200, 300, 500 };
+ private static final String DYNAMODB_THREADPOOL_NAME = "dynamodbPersistenceService";
+
+ private @NonNullByDefault({}) ItemRegistry itemRegistry;
+ private @Nullable DynamoDBClient db;
+ private final Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
+ private boolean isProperlyConfigured;
+ private @NonNullByDefault({}) DynamoDBConfig dbConfig;
+ private @NonNullByDefault({}) DynamoDBTableNameResolver tableNameResolver;
+ private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1,
+ new NamedThreadFactory(DYNAMODB_THREADPOOL_NAME));
+ private @Nullable ScheduledFuture<?> writeBufferedDataFuture;
+
+ /**
+ * For testing. Allows access to underlying DynamoDBClient.
+ *
+ * @return DynamoDBClient connected to AWS DynamoDB.
+ */
+ @Nullable
+ DynamoDBClient getDb() {
+ return db;
+ }
+
+ @Reference
+ public void setItemRegistry(ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ public void unsetItemRegistry(ItemRegistry itemRegistry) {
+ this.itemRegistry = null;
+ }
+
+ @Activate
+ public void activate(final @Nullable BundleContext bundleContext, final Map<String, Object> config) {
+ resetClient();
+ dbConfig = DynamoDBConfig.fromConfig(config);
+ if (dbConfig == null) {
+ // Configuration was invalid. Abort service activation.
+ // The error is already logged in fromConfig.
+ return;
+ }
+
+ tableNameResolver = new DynamoDBTableNameResolver(dbConfig.getTablePrefix());
+ try {
+ if (!ensureClient()) {
+ logger.error("Error creating dynamodb database client. Aborting service activation.");
+ return;
+ }
+ } catch (Exception e) {
+ logger.error("Error constructing dynamodb client", e);
+ return;
+ }
+
+ writeBufferedDataFuture = null;
+ resetWithBufferSize(dbConfig.getBufferSize());
+ long commitIntervalMillis = dbConfig.getBufferCommitIntervalMillis();
+ if (commitIntervalMillis > 0) {
+ writeBufferedDataFuture = scheduler.scheduleWithFixedDelay(new Runnable() {
+ @Override
+ public void run() {
+ DynamoDBPersistenceService.this.flushBufferedData();
+ }
+ }, 0, commitIntervalMillis, TimeUnit.MILLISECONDS);
+ }
+ isProperlyConfigured = true;
+ logger.debug("dynamodb persistence service activated");
+ }
+
+ @Deactivate
+ public void deactivate() {
+ logger.debug("dynamodb persistence service deactivated");
+ if (writeBufferedDataFuture != null) {
+ writeBufferedDataFuture.cancel(false);
+ writeBufferedDataFuture = null;
+ }
+ resetClient();
+ }
+
+ /**
+ * Initializes DynamoDBClient (db field)
+ *
+ * If DynamoDBClient constructor throws an exception, error is logged and false is returned.
+ *
+ * @return whether initialization was successful.
+ */
+ private boolean ensureClient() {
+ if (db == null) {
+ try {
+ db = new DynamoDBClient(dbConfig);
+ } catch (Exception e) {
+ logger.error("Error constructing dynamodb client", e);
+ return false;
+ }
+ }
+ return true;
+ }
+
+ @Override
+ public DynamoDBItem<?> persistenceItemFromState(String name, State state, Date time) {
+ return AbstractDynamoDBItem.fromState(name, state, time);
+ }
+
+ /**
+ * Create table (if not present) and wait for table to become active.
+ *
+ * Synchronized in order to ensure that at most single thread is creating the table at a time
+ *
+ * @param mapper
+ * @param dtoClass
+ * @return whether table creation succeeded.
+ */
+ private synchronized boolean createTable(DynamoDBMapper mapper, Class<?> dtoClass) {
+ if (db == null) {
+ return false;
+ }
+ String tableName;
+ try {
+ ProvisionedThroughput provisionedThroughput = new ProvisionedThroughput(dbConfig.getReadCapacityUnits(),
+ dbConfig.getWriteCapacityUnits());
+ CreateTableRequest request = mapper.generateCreateTableRequest(dtoClass);
+ request.setProvisionedThroughput(provisionedThroughput);
+ if (request.getGlobalSecondaryIndexes() != null) {
+ for (GlobalSecondaryIndex index : request.getGlobalSecondaryIndexes()) {
+ index.setProvisionedThroughput(provisionedThroughput);
+ }
+ }
+ tableName = request.getTableName();
+ try {
+ db.getDynamoClient().describeTable(tableName);
+ } catch (ResourceNotFoundException e) {
+ // No table present, continue with creation
+ db.getDynamoClient().createTable(request);
+ } catch (AmazonClientException e) {
+ logger.error("Table creation failed due to error in describeTable operation", e);
+ return false;
+ }
+
+ // table found or just created, wait
+ return waitForTableToBecomeActive(tableName);
+
+ } catch (AmazonClientException e) {
+ logger.error("Exception when creating table", e);
+ return false;
+ }
+
+ }
+
+ private boolean waitForTableToBecomeActive(String tableName) {
+ try {
+ logger.debug("Checking if table '{}' is created...", tableName);
+ final TableDescription tableDescription;
+ try {
+ tableDescription = db.getDynamoDB().getTable(tableName).waitForActive();
+ } catch (IllegalArgumentException e) {
+ logger.warn("Table '{}' is being deleted: {} {}", tableName, e.getClass().getSimpleName(),
+ e.getMessage());
+ return false;
+ } catch (ResourceNotFoundException e) {
+ logger.warn("Table '{}' was deleted unexpectedly: {} {}", tableName, e.getClass().getSimpleName(),
+ e.getMessage());
+ return false;
+ }
+ boolean success = TableStatus.ACTIVE.equals(TableStatus.fromValue(tableDescription.getTableStatus()));
+ if (success) {
+ logger.debug("Creation of table '{}' successful, table status is now {}", tableName,
+ tableDescription.getTableStatus());
+ } else {
+ logger.warn("Creation of table '{}' unsuccessful, table status is now {}", tableName,
+ tableDescription.getTableStatus());
+ }
+ return success;
+ } catch (AmazonClientException e) {
+ logger.error("Exception when checking table status (describe): {}", e.getMessage());
+ return false;
+ } catch (InterruptedException e) {
+ logger.error("Interrupted while trying to check table status: {}", e.getMessage());
+ return false;
+ }
+ }
+
+ private void resetClient() {
+ if (db == null) {
+ return;
+ }
+ db.shutdown();
+ db = null;
+ dbConfig = null;
+ tableNameResolver = null;
+ isProperlyConfigured = false;
+ }
+
+ private DynamoDBMapper getDBMapper(String tableName) {
+ try {
+ DynamoDBMapperConfig mapperConfig = new DynamoDBMapperConfig.Builder()
+ .withTableNameOverride(new DynamoDBMapperConfig.TableNameOverride(tableName))
+ .withPaginationLoadingStrategy(PaginationLoadingStrategy.LAZY_LOADING).build();
+ return new DynamoDBMapper(db.getDynamoClient(), mapperConfig);
+ } catch (AmazonClientException e) {
+ logger.error("Error getting db mapper: {}", e.getMessage());
+ throw e;
+ }
+ }
+
+ @Override
+ protected boolean isReadyToStore() {
+ return isProperlyConfigured && ensureClient();
+ }
+
+ @Override
+ public String getId() {
+ return "dynamodb";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "DynamoDB";
+ }
+
+ @Override
+ public @NonNull Set<@NonNull PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ protected void flushBufferedData() {
+ if (buffer == null || buffer.isEmpty()) {
+ return;
+ }
+ logger.debug("Writing buffered data. Buffer size: {}", buffer.size());
+
+ for (;;) {
+ Map<String, Deque<DynamoDBItem<?>>> itemsByTable = readBuffer();
+ // Write batch of data, one table at a time
+ for (Entry<String, Deque<DynamoDBItem<?>>> entry : itemsByTable.entrySet()) {
+ String tableName = entry.getKey();
+ Deque<DynamoDBItem<?>> batch = entry.getValue();
+ if (!batch.isEmpty()) {
+ flushBatch(getDBMapper(tableName), batch);
+ }
+ }
+ if (buffer != null && buffer.isEmpty()) {
+ break;
+ }
+ }
+ }
+
+ private Map<String, Deque<DynamoDBItem<?>>> readBuffer() {
+ Map<String, Deque<DynamoDBItem<?>>> batchesByTable = new HashMap<>(2);
+ // Get batch of data
+ while (!buffer.isEmpty()) {
+ DynamoDBItem<?> dynamoItem = buffer.poll();
+ if (dynamoItem == null) {
+ break;
+ }
+ String tableName = tableNameResolver.fromItem(dynamoItem);
+ Deque<DynamoDBItem<?>> batch = batchesByTable.computeIfAbsent(tableName,
+ new Function<String, Deque<DynamoDBItem<?>>>() {
+ @Override
+ public Deque<DynamoDBItem<?>> apply(String t) {
+ return new ArrayDeque<>();
+ }
+ });
+ batch.add(dynamoItem);
+ }
+ return batchesByTable;
+ }
+
+ /**
+ * Flush batch of data to DynamoDB
+ *
+ * @param mapper mapper associated with the batch
+ * @param batch batch of data to write to DynamoDB
+ */
+ private void flushBatch(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch) {
+ long currentTimeMillis = System.currentTimeMillis();
+ List<FailedBatch> failed = mapper.batchSave(batch);
+ for (FailedBatch failedBatch : failed) {
+ if (failedBatch.getException() instanceof ResourceNotFoundException) {
+ // Table did not exist. Try writing everything again.
+ retryFlushAfterCreatingTable(mapper, batch, failedBatch);
+ break;
+ } else {
+ logger.debug("Batch failed with {}. Retrying next with exponential back-off",
+ failedBatch.getException().getMessage());
+ new ExponentialBackoffRetry(failedBatch.getUnprocessedItems()).run();
+ }
+ }
+ if (failed.isEmpty()) {
+ logger.debug("flushBatch ended with {} items in {} ms: {}", batch.size(),
+ System.currentTimeMillis() - currentTimeMillis, batch);
+ } else {
+ logger.warn(
+ "flushBatch ended with {} items in {} ms: {}. There were some failed batches that were retried -- check logs for ERRORs to see if writes were successful",
+ batch.size(), System.currentTimeMillis() - currentTimeMillis, batch);
+ }
+ }
+
+ /**
+ * Retry flushing data after creating table associated with mapper
+ *
+ * @param mapper mapper associated with the batch
+ * @param batch original batch of data. Used for logging and to determine table name
+ * @param failedBatch failed batch that should be retried
+ */
+ private void retryFlushAfterCreatingTable(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch,
+ FailedBatch failedBatch) {
+ logger.debug("Table was not found. Trying to create the table and save again");
+ if (createTable(mapper, batch.peek().getClass())) {
+ logger.debug("Table creation successful, trying to save again");
+ if (!failedBatch.getUnprocessedItems().isEmpty()) {
+ ExponentialBackoffRetry retry = new ExponentialBackoffRetry(failedBatch.getUnprocessedItems());
+ retry.run();
+ if (retry.getUnprocessedItems().isEmpty()) {
+ logger.debug("Successfully saved items after table creation");
+ }
+ }
+ } else {
+ logger.warn("Table creation failed. Not storing some parts of batch: {}. Unprocessed items: {}", batch,
+ failedBatch.getUnprocessedItems());
+ }
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("got a query");
+ if (!isProperlyConfigured) {
+ logger.warn("Configuration for dynamodb not yet loaded or broken. Not querying data.");
+ return Collections.emptyList();
+ }
+ if (!ensureClient()) {
+ logger.warn("DynamoDB not connected. Not querying data.");
+ return Collections.emptyList();
+ }
+
+ String itemName = filter.getItemName();
+ Item item = getItemFromRegistry(itemName);
+ if (item == null) {
+ logger.warn("Could not get item {} from registry!", itemName);
+ return Collections.emptyList();
+ }
+
+ Class<? extends DynamoDBItem<?>> dtoClass = AbstractDynamoDBItem.getDynamoItemClass(item.getClass());
+ String tableName = tableNameResolver.fromClass(dtoClass);
+ DynamoDBMapper mapper = getDBMapper(tableName);
+ logger.debug("item {} (class {}) will be queried using DTO class {} from table {}", itemName,
+ item.getClass(), dtoClass, tableName);
+
+ List<HistoricItem> historicItems = new ArrayList<>();
+
+ DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression = DynamoDBQueryUtils.createQueryExpression(dtoClass,
+ filter);
+ @SuppressWarnings("rawtypes")
+ final PaginatedQueryList<? extends DynamoDBItem> paginatedList;
+ try {
+ paginatedList = mapper.query(dtoClass, queryExpression);
+ } catch (AmazonServiceException e) {
+ logger.error(
+ "DynamoDB query raised unexpected exception: {}. Returning empty collection. "
+ + "Status code 400 (resource not found) might occur if table was just created.",
+ e.getMessage());
+ return Collections.emptyList();
+ }
+ for (int itemIndexOnPage = 0; itemIndexOnPage < filter.getPageSize(); itemIndexOnPage++) {
+ int itemIndex = filter.getPageNumber() * filter.getPageSize() + itemIndexOnPage;
+ DynamoDBItem<?> dynamoItem;
+ try {
+ dynamoItem = paginatedList.get(itemIndex);
+ } catch (IndexOutOfBoundsException e) {
+ logger.debug("Index {} is out-of-bounds", itemIndex);
+ break;
+ }
+ if (dynamoItem != null) {
+ HistoricItem historicItem = dynamoItem.asHistoricItem(item);
+ logger.trace("Dynamo item {} converted to historic item: {}", item, historicItem);
+ historicItems.add(historicItem);
+ }
+
+ }
+ return historicItems;
+ }
+
+ /**
+ * Retrieves the item for the given name from the item registry
+ *
+ * @param itemName name of the item to retrieve
+ * @return item with the given name, or null if no such item exists in item registry.
+ */
+ private @Nullable Item getItemFromRegistry(String itemName) {
+ Item item = null;
+ try {
+ if (itemRegistry != null) {
+ item = itemRegistry.getItem(itemName);
+ }
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item {} from registry", itemName);
+ }
+ return item;
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+
+}
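Reviewer note: the `flushBatch`/`ExponentialBackoffRetry` flow above boils down to re-submitting whatever DynamoDB reports as unprocessed and doubling the pause between rounds. A self-contained sketch of that pattern (names are illustrative, not the add-on's actual classes):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

class BackoffSketch {

    /**
     * Re-submits the remaining items via writeFn (which returns the items it
     * could NOT process), doubling the pause between rounds, until everything
     * is written or attempts run out.
     */
    static <T> List<T> retryWithBackoff(List<T> items, Function<List<T>, List<T>> writeFn, long initialDelayMs,
            int maxAttempts) {
        List<T> remaining = new ArrayList<>(items);
        long delayMs = initialDelayMs;
        for (int attempt = 0; attempt < maxAttempts && !remaining.isEmpty(); attempt++) {
            if (attempt > 0) {
                try {
                    Thread.sleep(delayMs); // back off before retrying
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    break;
                }
                delayMs *= 2; // exponential growth of the pause
            }
            remaining = writeFn.apply(remaining);
        }
        return remaining; // items still unprocessed after all attempts
    }
}
```

The real implementation additionally handles table creation on `ResourceNotFoundException`; the sketch only shows the retry loop itself.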
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBQueryUtils.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBQueryUtils.java
new file mode 100644
index 0000000000000..5532935a00bdf
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBQueryUtils.java
@@ -0,0 +1,149 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Collections;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBQueryExpression;
+import com.amazonaws.services.dynamodbv2.model.AttributeValue;
+import com.amazonaws.services.dynamodbv2.model.ComparisonOperator;
+import com.amazonaws.services.dynamodbv2.model.Condition;
+
+/**
+ * Utility class
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class DynamoDBQueryUtils {
+ /**
+ * Construct a DynamoDB query from the given filter
+ *
+ * @param dtoClass DTO class to query
+ * @param filter filter criteria
+ * @return DynamoDBQueryExpression corresponding to the given FilterCriteria
+ */
+ public static DynamoDBQueryExpression<DynamoDBItem<?>> createQueryExpression(
+ Class<? extends DynamoDBItem<?>> dtoClass, FilterCriteria filter) {
+ DynamoDBItem<?> item = getDynamoDBHashKey(dtoClass, filter.getItemName());
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression = new DynamoDBQueryExpression<DynamoDBItem<?>>()
+ .withHashKeyValues(item).withScanIndexForward(filter.getOrdering() == Ordering.ASCENDING)
+ .withLimit(filter.getPageSize());
+ maybeAddTimeFilter(queryExpression, filter);
+ maybeAddStateFilter(filter, queryExpression);
+ return queryExpression;
+ }
+
+ private static DynamoDBItem<?> getDynamoDBHashKey(Class<? extends DynamoDBItem<?>> dtoClass, String itemName) {
+ DynamoDBItem<?> item;
+ try {
+ item = dtoClass.newInstance();
+ } catch (InstantiationException | IllegalAccessException e) {
+ throw new RuntimeException(e);
+ }
+ item.setName(itemName);
+ return item;
+ }
+
+ private static void maybeAddStateFilter(FilterCriteria filter,
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression) {
+ if (filter.getOperator() != null && filter.getState() != null) {
+ // Convert the filter's state to a DynamoDBItem in order to get a suitable string representation of the state
+ final DynamoDBItem<?> filterState = AbstractDynamoDBItem.fromState(filter.getItemName(), filter.getState(),
+ new Date());
+ queryExpression.setFilterExpression(String.format("%s %s :opstate", DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE,
+ operatorAsString(filter.getOperator())));
+
+ filterState.accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ queryExpression.setExpressionAttributeValues(Collections.singletonMap(":opstate",
+ new AttributeValue().withS(dynamoStringItem.getState())));
+ }
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ queryExpression.setExpressionAttributeValues(Collections.singletonMap(":opstate",
+ new AttributeValue().withN(dynamoBigDecimalItem.getState().toPlainString())));
+ }
+ });
+
+ }
+ }
+
+ private static @Nullable Condition maybeAddTimeFilter(
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression, final FilterCriteria filter) {
+ final Condition timeCondition = constructTimeCondition(filter);
+ if (timeCondition != null) {
+ queryExpression.setRangeKeyConditions(
+ Collections.singletonMap(DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, timeCondition));
+ }
+ return timeCondition;
+ }
+
+ private static @Nullable Condition constructTimeCondition(FilterCriteria filter) {
+ boolean hasBegin = filter.getBeginDate() != null;
+ boolean hasEnd = filter.getEndDate() != null;
+
+ final Condition timeCondition;
+ if (!hasBegin && !hasEnd) {
+ timeCondition = null;
+ } else if (!hasBegin && hasEnd) {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.LE).withAttributeValueList(
+ new AttributeValue().withS(AbstractDynamoDBItem.DATEFORMATTER.format(filter.getEndDate())));
+ } else if (hasBegin && !hasEnd) {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.GE).withAttributeValueList(
+ new AttributeValue().withS(AbstractDynamoDBItem.DATEFORMATTER.format(filter.getBeginDate())));
+ } else {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.BETWEEN).withAttributeValueList(
+ new AttributeValue().withS(AbstractDynamoDBItem.DATEFORMATTER.format(filter.getBeginDate())),
+ new AttributeValue().withS(AbstractDynamoDBItem.DATEFORMATTER.format(filter.getEndDate())));
+ }
+ return timeCondition;
+ }
+
+ /**
+ * Convert op to a string suitable for a DynamoDB filter expression
+ *
+ * @param op operator to convert
+ * @return string representation corresponding to the given Operator
+ */
+ private static String operatorAsString(Operator op) {
+ switch (op) {
+ case EQ:
+ return "=";
+ case NEQ:
+ return "<>";
+ case LT:
+ return "<";
+ case LTE:
+ return "<=";
+ case GT:
+ return ">";
+ case GTE:
+ return ">=";
+ default:
+ throw new IllegalStateException("Unknown operator " + op);
+ }
+ }
+}
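Reviewer note: `operatorAsString` plus `setFilterExpression` above translate an openHAB comparison operator into the symbol DynamoDB expects in a filter expression (`<>` rather than `!=`, for example). A self-contained sketch of that mapping, with a stand-in enum mirroring `FilterCriteria.Operator` (the real enum lives in openHAB core):

```java
class OperatorMapping {

    // Stand-in for org.openhab.core.persistence.FilterCriteria.Operator
    enum Op { EQ, NEQ, LT, LTE, GT, GTE }

    /** Maps a comparison operator to the symbol DynamoDB filter expressions accept. */
    static String toDynamoSymbol(Op op) {
        switch (op) {
            case EQ:
                return "=";
            case NEQ:
                return "<>"; // DynamoDB uses <> rather than !=
            case LT:
                return "<";
            case LTE:
                return "<=";
            case GT:
                return ">";
            case GTE:
                return ">=";
            default:
                throw new IllegalStateException("Unknown operator " + op);
        }
    }

    /** Builds a filter expression like "itemState >= :opstate", as done above. */
    static String filterExpression(String attribute, Op op) {
        return String.format("%s %s :opstate", attribute, toDynamoSymbol(op));
    }
}
```

The `:opstate` placeholder is then bound via expression attribute values, exactly as the visitor in `maybeAddStateFilter` does.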
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBStringItem.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBStringItem.java
new file mode 100644
index 0000000000000..2eaf7fbeead85
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBStringItem.java
@@ -0,0 +1,75 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBDocument;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
+
+/**
+ * DynamoDBItem for items that can be serialized as DynamoDB string
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@DynamoDBDocument
+public class DynamoDBStringItem extends AbstractDynamoDBItem<String> {
+
+ public DynamoDBStringItem() {
+ this(null, null, null);
+ }
+
+ public DynamoDBStringItem(String name, String state, Date time) {
+ super(name, state, time);
+ }
+
+ @DynamoDBAttribute(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE)
+ @Override
+ public String getState() {
+ return state;
+ }
+
+ @DynamoDBHashKey(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ @DynamoDBRangeKey(attributeName = ATTRIBUTE_NAME_TIMEUTC)
+ public Date getTime() {
+ return time;
+ }
+
+ @Override
+ public void accept(DynamoDBItemVisitor visitor) {
+ visitor.visit(this);
+ }
+
+ @Override
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public void setState(String state) {
+ this.state = state;
+ }
+
+ @Override
+ public void setTime(Date time) {
+ this.time = time;
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolver.java b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolver.java
new file mode 100644
index 0000000000000..39427ba17bb9a
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/main/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolver.java
@@ -0,0 +1,66 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+/**
+ * The DynamoDBTableNameResolver resolves the DynamoDB table name for a given item.
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DynamoDBTableNameResolver {
+
+ private final String tablePrefix;
+
+ public DynamoDBTableNameResolver(String tablePrefix) {
+ this.tablePrefix = tablePrefix;
+ }
+
+ public String fromItem(DynamoDBItem<?> item) {
+ final String[] tableName = new String[1];
+
+ // Use the visitor pattern to deduce the table name
+ item.accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ tableName[0] = tablePrefix + "bigdecimal";
+ }
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ tableName[0] = tablePrefix + "string";
+ }
+ });
+ return tableName[0];
+ }
+
+ /**
+ * Resolve the table name corresponding to the given DynamoDBItem class
+ *
+ * @param clazz DynamoDBItem class
+ * @return table name for the given class
+ */
+ public String fromClass(Class<? extends DynamoDBItem<?>> clazz) {
+ DynamoDBItem<?> dummy;
+ try {
+ // Construct a new instance of the class (assuming the presence of a no-argument constructor)
+ // in order to re-use the fromItem(DynamoDBItem) logic
+ dummy = clazz.getConstructor().newInstance();
+ } catch (Exception e) {
+ throw new IllegalStateException(String.format("Could not find suitable constructor for class %s", clazz));
+ }
+ return this.fromItem(dummy);
+ }
+
+}
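Reviewer note: the resolver above uses the visitor pattern so each concrete item type picks its own table suffix without `instanceof` chains. A minimal stand-alone illustration (stand-in types; the real classes are `DynamoDBStringItem`/`DynamoDBBigDecimalItem`):

```java
class TableNameSketch {

    interface Visitor {
        void visitString(StringDbItem i);

        void visitNumber(NumberDbItem i);
    }

    interface DbItem {
        void accept(Visitor v);
    }

    static class StringDbItem implements DbItem {
        @Override
        public void accept(Visitor v) {
            v.visitString(this); // double dispatch: the concrete type chooses the visit method
        }
    }

    static class NumberDbItem implements DbItem {
        @Override
        public void accept(Visitor v) {
            v.visitNumber(this);
        }
    }

    /** Resolves the table name via double dispatch, like fromItem above. */
    static String tableName(String prefix, DbItem item) {
        final String[] name = new String[1]; // array works around the effectively-final capture rule
        item.accept(new Visitor() {
            @Override
            public void visitString(StringDbItem i) {
                name[0] = prefix + "string";
            }

            @Override
            public void visitNumber(NumberDbItem i) {
                name[0] = prefix + "bigdecimal";
            }
        });
        return name[0];
    }
}
```

Adding a third item type then forces a compile error in every visitor, which is the main advantage over `instanceof` checks.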
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemGetDynamoItemClassTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemGetDynamoItemClassTest.java
new file mode 100644
index 0000000000000..51976234dbe46
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemGetDynamoItemClassTest.java
@@ -0,0 +1,87 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.assertEquals;
+
+import java.io.IOException;
+
+import org.junit.Test;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+
+/**
+ * Test for AbstractDynamoDBItem.getDynamoItemClass
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public class AbstractDynamoDBItemGetDynamoItemClassTest {
+
+ @Test
+ public void testCallItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(CallItem.class));
+ }
+
+ @Test
+ public void testContactItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(ContactItem.class));
+ }
+
+ @Test
+ public void testDateTimeItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(DateTimeItem.class));
+ }
+
+ @Test
+ public void testStringItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(StringItem.class));
+ }
+
+ @Test
+ public void testLocationItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(LocationItem.class));
+ }
+
+ @Test
+ public void testNumberItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(NumberItem.class));
+ }
+
+ @Test
+ public void testColorItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(ColorItem.class));
+ }
+
+ @Test
+ public void testDimmerItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(DimmerItem.class));
+ }
+
+ @Test
+ public void testRollershutterItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(RollershutterItem.class));
+ }
+
+ @Test
+ public void testOnOffTypeWithSwitchItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(SwitchItem.class));
+ }
+}
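Reviewer note: the expectations in this test class encode a simple type mapping — item types whose states are numeric (Contact, Dimmer, Number, Rollershutter, Switch) go to the BigDecimal DTO, everything else is stored as a string. A sketch of such a lookup table (illustrative names, not the add-on's actual `getDynamoItemClass` implementation):

```java
import java.util.HashMap;
import java.util.Map;

class ItemClassMapping {

    // Stand-in marker classes for the two DTO flavours
    static class StringDto {
    }

    static class NumberDto {
    }

    private static final Map<String, Class<?>> MAPPING = new HashMap<>();

    static {
        // numeric-capable item types are persisted as numbers...
        for (String numeric : new String[] { "Contact", "Dimmer", "Number", "Rollershutter", "Switch" }) {
            MAPPING.put(numeric, NumberDto.class);
        }
        // ...everything else falls back to a string representation
        for (String stringy : new String[] { "Call", "Color", "DateTime", "Location", "String" }) {
            MAPPING.put(stringy, StringDto.class);
        }
    }

    /** Looks up the DTO class for an item type, failing loudly on unknown types. */
    static Class<?> dtoClassFor(String itemType) {
        Class<?> clazz = MAPPING.get(itemType);
        if (clazz == null) {
            throw new IllegalArgumentException("Unknown item type " + itemType);
        }
        return clazz;
    }
}
```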
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemSerializationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemSerializationTest.java
new file mode 100644
index 0000000000000..45a78bcea9975
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractDynamoDBItemSerializationTest.java
@@ -0,0 +1,270 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.*;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.util.Calendar;
+import java.util.Date;
+import java.util.TimeZone;
+
+import org.apache.commons.lang.StringUtils;
+import org.junit.Test;
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.types.UpDownType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * Test for AbstractDynamoDBItem.fromState and AbstractDynamoDBItem.asHistoricItem for all kind of states
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public class AbstractDynamoDBItemSerializationTest {
+
+ private final Date date = new Date(400);
+
+ /**
+ * Generic function testing serialization of item state to internal format in DB. In other words, conversion of
+ * Item with state to DynamoDBItem
+ *
+ * @param state item state
+ * @param expectedState internal format in DB representing the item state
+ * @return dynamo db item
+ * @throws IOException
+ */
+ public DynamoDBItem<?> testStateGeneric(State state, Object expectedState) throws IOException {
+ DynamoDBItem<?> dbItem = AbstractDynamoDBItem.fromState("item1", state, date);
+
+ assertEquals("item1", dbItem.getName());
+ assertEquals(date, dbItem.getTime());
+ Object actualState = dbItem.getState();
+ if (expectedState instanceof BigDecimal) {
+ BigDecimal expectedRounded = DynamoDBBigDecimalItem.loseDigits(((BigDecimal) expectedState));
+ assertTrue(
+ String.format("Expected state %s (%s but with some digits lost) did not match actual state %s",
+ expectedRounded, expectedState, actualState),
+ expectedRounded.compareTo((BigDecimal) actualState) == 0);
+ } else {
+ assertEquals(expectedState, actualState);
+ }
+ return dbItem;
+ }
+
+ /**
+ * Test state deserialization, that is, conversion of a DynamoDBItem to a HistoricItem
+ *
+ * @param dbItem dynamo db item
+ * @param item parameter for DynamoDBItem.asHistoricItem
+ * @param expectedState Expected state of the historic item. DecimalTypes are compared with reduced accuracy
+ * @return the deserialized historic item
+ * @throws IOException
+ */
+ public HistoricItem testAsHistoricGeneric(DynamoDBItem<?> dbItem, Item item, Object expectedState)
+ throws IOException {
+ HistoricItem historicItem = dbItem.asHistoricItem(item);
+
+ assertEquals("item1", historicItem.getName());
+ assertEquals(date, historicItem.getTimestamp());
+ assertEquals(expectedState.getClass(), historicItem.getState().getClass());
+ if (expectedState instanceof DecimalType) {
+ // serialization loses accuracy, take this into consideration
+ BigDecimal expectedRounded = DynamoDBBigDecimalItem
+ .loseDigits(((DecimalType) expectedState).toBigDecimal());
+ BigDecimal actual = ((DecimalType) historicItem.getState()).toBigDecimal();
+ assertTrue(String.format("Expected state %s (%s but with some digits lost) did not match actual state %s",
+ expectedRounded, expectedState, actual), expectedRounded.compareTo(actual) == 0);
+ } else {
+ assertEquals(expectedState, historicItem.getState());
+ }
+ return historicItem;
+ }
+
+ @Test
+ public void testUndefWithNumberItem() throws IOException {
+ final DynamoDBItem<?> dbitem = testStateGeneric(UnDefType.UNDEF, "");
+ assertTrue(dbitem instanceof DynamoDBStringItem);
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), UnDefType.UNDEF);
+ }
+
+ @Test
+ public void testCallTypeWithCallItem() throws IOException {
+ final DynamoDBItem<?> dbitem = testStateGeneric(new StringListType("origNum", "destNum"), "origNum,destNum");
+ testAsHistoricGeneric(dbitem, new CallItem("foo"), new StringListType("origNum", "destNum"));
+
+ }
+
+ @Test
+ public void testOpenClosedTypeWithContactItem() throws IOException {
+ final DynamoDBItem<?> dbitemOpen = testStateGeneric(OpenClosedType.CLOSED, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOpen, new ContactItem("foo"), OpenClosedType.CLOSED);
+
+ final DynamoDBItem<?> dbitemClosed = testStateGeneric(OpenClosedType.OPEN, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemClosed, new ContactItem("foo"), OpenClosedType.OPEN);
+ }
+
+ @Test
+ public void testDateTimeTypeWithDateTimeItem() throws IOException {
+ Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ calendar.set(2016, Calendar.MAY, 1, 13, 46, 0);
+ calendar.set(Calendar.MILLISECOND, 50);
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(calendar), "2016-05-01T13:46:00.050Z");
+ testAsHistoricGeneric(dbitem, new DateTimeItem("foo"), new DateTimeType(calendar));
+ }
+
+ @Test
+ public void testDateTimeTypeWithStringItem() throws IOException {
+ Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ calendar.set(2016, Calendar.MAY, 1, 13, 46, 0);
+ calendar.set(Calendar.MILLISECOND, 50);
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(calendar), "2016-05-01T13:46:00.050Z");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-05-01T13:46:00.050Z"));
+ }
+
+ @Test
+ public void testDateTimeTypeLocalWithDateTimeItem() throws IOException {
+ Calendar calendar = Calendar.getInstance();
+ calendar.setTimeZone(TimeZone.getTimeZone("GMT+03:00"));
+ calendar.setTimeInMillis(1468773487050L); // GMT: Sun, 17 Jul 2016 16:38:07.050 GMT
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(calendar), "2016-07-17T16:38:07.050Z");
+
+ // when deserializing data, we get the date in UTC Calendar
+ Calendar expectedCal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ expectedCal.setTimeInMillis(1468773487050L);
+ testAsHistoricGeneric(dbitem, new DateTimeItem("foo"), new DateTimeType(expectedCal));
+ }
+
+ @Test
+ public void testDateTimeTypeLocalWithStringItem() throws IOException {
+ Calendar calendar = Calendar.getInstance();
+ calendar.setTimeZone(TimeZone.getTimeZone("GMT+03:00"));
+ calendar.setTimeInMillis(1468773487050L); // GMT: Sun, 17 Jul 2016 16:38:07.050 GMT
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(calendar), "2016-07-17T16:38:07.050Z");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-07-17T16:38:07.050Z"));
+ }
+
+ @Test
+ public void testPointTypeWithLocationItem() throws IOException {
+ final PointType point = new PointType(new DecimalType(60.3), new DecimalType(30.2), new DecimalType(510.90));
+ String expected = StringUtils.join(
+ new String[] { point.getLatitude().toBigDecimal().toString(),
+ point.getLongitude().toBigDecimal().toString(), point.getAltitude().toBigDecimal().toString() },
+ ",");
+ DynamoDBItem<?> dbitem = testStateGeneric(point, expected);
+ testAsHistoricGeneric(dbitem, new LocationItem("foo"), point);
+ }
+
+ @Test
+ public void testDecimalTypeWithNumberItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new DecimalType("3.2"), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType("3.2"));
+ }
+
+ @Test
+ public void testPercentTypeWithColorItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new ColorItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithDimmerItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new DimmerItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithRollerShutterItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new RollershutterItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithNumberItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ // note: comes back as DecimalType instead of the original PercentType
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testUpDownTypeWithRollershutterItem() throws IOException {
+ // note: comes back as PercentType instead of the original UpDownType
+ DynamoDBItem<?> dbItemDown = testStateGeneric(UpDownType.DOWN, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbItemDown, new RollershutterItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbItemUp = testStateGeneric(UpDownType.UP, BigDecimal.ONE);
+ testAsHistoricGeneric(dbItemUp, new RollershutterItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testStringTypeWithStringItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new StringType("foo bar"), "foo bar");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("foo bar"));
+ }
+
+ @Test
+ public void testOnOffTypeWithColorItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new ColorItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new ColorItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testOnOffTypeWithDimmerItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new DimmerItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new DimmerItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testOnOffTypeWithSwitchItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new SwitchItem("foo"), OnOffType.OFF);
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new SwitchItem("foo"), OnOffType.ON);
+ }
+
+ @Test
+ public void testHSBTypeWithColorItem() throws IOException {
+ HSBType hsb = new HSBType(new DecimalType(1.5), new PercentType(new BigDecimal(2.5)),
+ new PercentType(new BigDecimal(3.5)));
+ DynamoDBItem<?> dbitem = testStateGeneric(hsb, "1.5,2.5,3.5");
+ testAsHistoricGeneric(dbitem, new ColorItem("foo"), hsb);
+ }
+}
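Reviewer note: the helpers in this class deliberately compare BigDecimals with `compareTo(...) == 0` rather than `equals`, because `equals` is scale-sensitive and serialization through DynamoDB may change the scale. A tiny illustration of why that matters:

```java
import java.math.BigDecimal;

class BigDecimalCompare {

    /** Numeric equality that ignores scale, unlike BigDecimal.equals. */
    static boolean numericallyEqual(BigDecimal a, BigDecimal b) {
        return a.compareTo(b) == 0;
    }
}
```

With `equals`, `3.20` and `3.2` are considered different values; `compareTo` treats them as equal, which is the behaviour the assertions above rely on.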
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractTwoItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractTwoItemIntegrationTest.java
new file mode 100644
index 0000000000000..3a65c35a994a7
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/AbstractTwoItemIntegrationTest.java
@@ -0,0 +1,352 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.*;
+
+import java.util.Date;
+import java.util.Iterator;
+
+import org.junit.Assume;
+import org.junit.BeforeClass;
+import org.junit.Test;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is an abstract class helping with integration testing of the persistence service. Different kinds of queries
+ * are tested against an actual DynamoDB database.
+ *
+ * An inheritor of this base class needs to store two states of one item in a static method annotated
+ * with @BeforeClass. This static method should update the protected static fields beforeStore (date before storing
+ * anything), afterStore1 (after storing the first item, but before storing the second item) and afterStore2 (after
+ * storing the second item). The item name must correspond to getItemName. The first state needs to be smaller than
+ * the second state.
+ *
+ * To have more comprehensive tests, the inheritor class can define getQueryItemStateBetween to provide a value between
+ * the two states. Null can be used to omit the additional tests.
+ *
+ * See DimmerItemIntegrationTest for an example of how to use this base class.
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public abstract class AbstractTwoItemIntegrationTest extends BaseIntegrationTest {
+
+ protected static Date beforeStore;
+ protected static Date afterStore1;
+ protected static Date afterStore2;
+
+ protected abstract String getItemName();
+
+ /**
+ * State of the item stored first; should be smaller than the second value
+ *
+ * @return the first stored state
+ */
+ protected abstract State getFirstItemState();
+
+ /**
+ * State of the item stored second; should be larger than the first value
+ *
+ * @return the second stored state
+ */
+ protected abstract State getSecondItemState();
+
+ /**
+ * State between the first and second states. Use null to omit the extended tests using this value.
+ *
+ * @return a state between the first and second states, or null
+ */
+ protected abstract State getQueryItemStateBetween();
+
+ protected void assertStateEquals(State expected, State actual) {
+ assertEquals(expected, actual);
+ }
+
+ @BeforeClass
+ public static void checkService() throws InterruptedException {
+ String msg = "DynamoDB integration tests will be skipped. Did you specify AWS credentials for testing? "
+ + "See BaseIntegrationTest for more details";
+ if (service == null) {
+ System.out.println(msg);
+ }
+ Assume.assumeTrue(msg, service != null);
+ }
+
+ /**
+ * Asserts that iterable contains correct items and nothing else
+ *
+ */
+ private void assertIterableContainsItems(Iterable<HistoricItem> iterable, boolean ascending) {
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ HistoricItem actual2 = iterator.next();
+ assertFalse(iterator.hasNext());
+
+ for (HistoricItem actual : new HistoricItem[] { actual1, actual2 }) {
+ assertEquals(getItemName(), actual.getName());
+ }
+ HistoricItem storedFirst;
+ HistoricItem storedSecond;
+ if (ascending) {
+ storedFirst = actual1;
+ storedSecond = actual2;
+ } else {
+ storedFirst = actual2;
+ storedSecond = actual1;
+ }
+
+ assertStateEquals(getFirstItemState(), storedFirst.getState());
+ assertTrue(storedFirst.getTimestamp().before(afterStore1));
+ assertTrue(storedFirst.getTimestamp().after(beforeStore));
+
+ assertStateEquals(getSecondItemState(), storedSecond.getState());
+ assertTrue(storedSecond.getTimestamp().before(afterStore2));
+ assertTrue(storedSecond.getTimestamp().after(afterStore1));
+ }
+
+ @Test
+ public void testQueryUsingName() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStart() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndEnd() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndEndNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setEndDate(beforeStore);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEnd() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndDesc() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.DESCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, false);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithNEQOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.NEQ);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore1));
+ assertTrue(actual1.getTimestamp().after(beforeStore));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithEQOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.EQ);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore1));
+ assertTrue(actual1.getTimestamp().after(beforeStore));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LT);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore1));
+ assertTrue(actual1.getTimestamp().after(beforeStore));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTOperatorNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LT);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ assertFalse(iterator.hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTEOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LTE);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore1));
+ assertTrue(actual1.getTimestamp().after(beforeStore));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTOperator() {
+ // Skip for subclasses which have null "state between"
+ Assume.assumeTrue(getQueryItemStateBetween() != null);
+
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GT);
+ criteria.setState(getQueryItemStateBetween());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getSecondItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore2));
+ assertTrue(actual1.getTimestamp().after(afterStore1));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTOperatorNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GT);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ assertFalse(iterator.hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTEOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GTE);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getSecondItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore2));
+ assertTrue(actual1.getTimestamp().after(afterStore1));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndFirst() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore1);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().before(afterStore1));
+ assertTrue(actual1.getTimestamp().after(beforeStore));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(beforeStore); // end date deliberately equals begin date, so no stored state can match
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+}
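The timestamp assertions above all rely on the same timing invariant: with the 10 ms sleeps used by the `@BeforeClass` store methods, a `Date` captured between two snapshots sorts strictly between them. A standalone sketch of that invariant:

```java
import java.util.Date;

public class TimestampWindowSketch {
    public static void main(String[] args) throws InterruptedException {
        Date beforeStore = new Date();
        Thread.sleep(10);
        // Stands in for the timestamp the persistence service records for the item
        Date stored = new Date();
        Thread.sleep(10);
        Date afterStore = new Date();
        // Date has millisecond resolution, so the 10 ms gaps guarantee strict ordering
        System.out.println(stored.after(beforeStore) && stored.before(afterStore)); // prints true
    }
}
```

This is why each test can assert `getTimestamp().after(beforeStore)` and `getTimestamp().before(afterStore1)` without ever comparing for equality.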
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/BaseIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/BaseIntegrationTest.java
new file mode 100644
index 0000000000000..8dec2cb2cadae
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/BaseIntegrationTest.java
@@ -0,0 +1,210 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.stream.Stream;
+
+import org.apache.commons.lang.NotImplementedException;
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.BeforeClass;
+import org.openhab.core.common.registry.RegistryChangeListener;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemNotUniqueException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.items.RegistryHook;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class BaseIntegrationTest {
+ protected static final Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
+ protected static DynamoDBPersistenceService service;
+ protected static final Map<String, Item> items = new HashMap<>();
+
+ static {
+ System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "trace");
+ }
+
+ @BeforeClass
+ public static void initService() throws InterruptedException {
+ items.put("dimmer", new DimmerItem("dimmer"));
+ items.put("number", new NumberItem("number"));
+ items.put("string", new StringItem("string"));
+ items.put("switch", new SwitchItem("switch"));
+ items.put("contact", new ContactItem("contact"));
+ items.put("color", new ColorItem("color"));
+ items.put("rollershutter", new RollershutterItem("rollershutter"));
+ items.put("datetime", new DateTimeItem("datetime"));
+ items.put("call", new CallItem("call"));
+ items.put("location", new LocationItem("location"));
+
+ service = new DynamoDBPersistenceService();
+ service.setItemRegistry(new ItemRegistry() {
+
+ @Override
+ public Collection<Item> getItems(String pattern) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Collection<Item> getItems() {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Item getItemByPattern(String name) throws ItemNotFoundException, ItemNotUniqueException {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Item getItem(String name) throws ItemNotFoundException {
+ Item item = items.get(name);
+ if (item == null) {
+ throw new ItemNotFoundException(name);
+ }
+ return item;
+ }
+
+ @Override
+ public void addRegistryChangeListener(RegistryChangeListener<Item> listener) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @NonNull Collection<@NonNull Item> getAll() {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Stream<Item> stream() {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @Nullable Item get(String key) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public void removeRegistryChangeListener(RegistryChangeListener<Item> listener) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @NonNull Item add(@NonNull Item element) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @Nullable Item update(@NonNull Item element) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @Nullable Item remove(@NonNull String key) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Collection<Item> getItemsOfType(@NonNull String type) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Collection<Item> getItemsByTag(@NonNull String... tags) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public Collection<Item> getItemsByTagAndType(@NonNull String type, @NonNull String... tags) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public <T extends Item> Collection<T> getItemsByTag(@NonNull Class<T> typeFilter,
+ @NonNull String... tags) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public @Nullable Item remove(@NonNull String itemName, boolean recursive) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public void addRegistryHook(RegistryHook<Item> hook) {
+ throw new NotImplementedException();
+ }
+
+ @Override
+ public void removeRegistryHook(RegistryHook<Item> hook) {
+ throw new NotImplementedException();
+ }
+ });
+
+ HashMap<String, Object> config = new HashMap<>();
+ config.put("region", System.getProperty("DYNAMODBTEST_REGION"));
+ config.put("accessKey", System.getProperty("DYNAMODBTEST_ACCESS"));
+ config.put("secretKey", System.getProperty("DYNAMODBTEST_SECRET"));
+ config.put("tablePrefix", "dynamodb-integration-tests-");
+
+ // Disable buffering
+ config.put("bufferSize", "0");
+
+ for (Entry<String, Object> entry : config.entrySet()) {
+ if (entry.getValue() == null) {
+ logger.warn("Expecting {} to have a value for integration tests. Integration tests will be skipped.",
+ entry.getKey());
+ service = null;
+ return;
+ }
+ }
+
+ service.activate(null, config);
+
+ // Clear data
+ for (String table : new String[] { "dynamodb-integration-tests-bigdecimal",
+ "dynamodb-integration-tests-string" }) {
+ try {
+ service.getDb().getDynamoClient().deleteTable(table);
+ service.getDb().getDynamoDB().getTable(table).waitForDelete();
+ } catch (ResourceNotFoundException e) {
+ // table does not exist, nothing to clear
+ }
+ }
+
+ }
+
+}
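BaseIntegrationTest above leaves `service` null unless all three `DYNAMODBTEST_*` system properties are provided, and `checkService()` then skips the tests via `Assume`. The check can be sketched standalone (the class and method names here are illustrative, not openHAB API):

```java
public class CredentialCheckSketch {
    // System property names taken from BaseIntegrationTest above
    static final String[] REQUIRED = { "DYNAMODBTEST_REGION", "DYNAMODBTEST_ACCESS", "DYNAMODBTEST_SECRET" };

    static boolean credentialsConfigured() {
        for (String key : REQUIRED) {
            if (System.getProperty(key) == null) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Mirrors the warn-and-skip behaviour: no credentials means no integration tests
        System.out.println(credentialsConfigured() ? "running integration tests" : "skipping integration tests");
    }
}
```

In a Maven build the properties would typically be passed on the command line, e.g. `mvn test -DDYNAMODBTEST_REGION=eu-west-1 ...` (assuming the surefire configuration forwards system properties).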
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/CallItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/CallItemIntegrationTest.java
new file mode 100644
index 0000000000000..407343a02d04d
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/CallItemIntegrationTest.java
@@ -0,0 +1,81 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.assertEquals;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class CallItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "call";
+ // values are encoded as dest##orig; ordering follows string comparison
+ private static final StringListType state1 = new StringListType("orig1", "dest1");
+ private static final StringListType state2 = new StringListType("orig1", "dest3");
+ private static final StringListType stateBetween = new StringListType("orig2", "dest2");
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ CallItem item = (CallItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ // Since CallType.equals is broken, toString is used as workaround
+ assertEquals(expected.toString(), actual.toString());
+ }
+
+}
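The `stateBetween` choice above only works because of the string ordering noted in the comment: with the `dest##orig` encoding, `("orig2", "dest2")` serializes to a string that sorts strictly between the two stored states. A standalone sketch (the `encode()` helper and the `##` separator are taken from the comment, as illustration only):

```java
public class CallOrderingSketch {
    // Hypothetical stand-in for the dest##orig encoding mentioned in the test comment
    static String encode(String dest, String orig) {
        return dest + "##" + orig;
    }

    public static void main(String[] args) {
        String first = encode("dest1", "orig1");   // state1
        String between = encode("dest2", "orig2"); // stateBetween
        String second = encode("dest3", "orig1");  // state2
        // String comparison decides at the differing character: dest1 < dest2 < dest3
        System.out.println(first.compareTo(between) < 0 && between.compareTo(second) < 0); // prints true
    }
}
```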
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ColorItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ColorItemIntegrationTest.java
new file mode 100644
index 0000000000000..950b254102c89
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ColorItemIntegrationTest.java
@@ -0,0 +1,86 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class ColorItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static HSBType color(double hue, int saturation, int brightness) {
+ return new HSBType(new DecimalType(hue), new PercentType(saturation), new PercentType(brightness));
+ }
+
+ private static HSBType color(String hue, int saturation, int brightness) {
+ return new HSBType(new DecimalType(new BigDecimal(hue)), new PercentType(saturation),
+ new PercentType(brightness));
+ }
+
+ private static final String name = "color";
+ // values are encoded as hue,saturation,brightness; ordering follows string comparison
+ private static final HSBType state1 = color("3.1493842988948932984298384892384823984923849238492839483294893", 50,
+ 50);
+ private static final HSBType state2 = color(75, 100, 90);
+ private static final HSBType stateBetween = color(60, 50, 50);
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ ColorItem item = (ColorItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ContactItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ContactItemIntegrationTest.java
new file mode 100644
index 0000000000000..f332c95f8a432
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/ContactItemIntegrationTest.java
@@ -0,0 +1,75 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class ContactItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "contact";
+ private static final OpenClosedType state1 = OpenClosedType.CLOSED;
+ private static final OpenClosedType state2 = OpenClosedType.OPEN;
+ // There is no OpenClosedType state value between CLOSED and OPEN.
+ // The extended query tests in AbstractTwoItemIntegrationTest are omitted by setting stateBetween to null.
+ private static final OnOffType stateBetween = null;
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ ContactItem item = (ContactItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DateTimeItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DateTimeItemIntegrationTest.java
new file mode 100644
index 0000000000000..10b193c65e453
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DateTimeItemIntegrationTest.java
@@ -0,0 +1,86 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Calendar;
+import java.util.Date;
+import java.util.TimeZone;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DateTimeItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "datetime";
+ private static final Calendar cal1 = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ private static final Calendar cal2 = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ private static final Calendar calBetween = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
+ static {
+ // Calendar months are zero-based: month 5 is June
+ cal1.set(2016, 5, 15, 10, 0, 0);
+ cal2.set(2016, 5, 15, 16, 0, 0);
+ cal2.set(Calendar.MILLISECOND, 123);
+ calBetween.set(2016, 5, 15, 14, 0, 0);
+ }
+
+ private static final DateTimeType state1 = new DateTimeType(cal1);
+ private static final DateTimeType state2 = new DateTimeType(cal2);
+ private static final DateTimeType stateBetween = new DateTimeType(calBetween);
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ DateTimeItem item = (DateTimeItem) items.get(name);
+
+ item.setState(state1);
+
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DimmerItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DimmerItemIntegrationTest.java
new file mode 100644
index 0000000000000..de2040f53829b
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DimmerItemIntegrationTest.java
@@ -0,0 +1,74 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DimmerItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "dimmer";
+ private static final PercentType state1 = new PercentType(66);
+ private static final PercentType state2 = new PercentType(68);
+ private static final PercentType stateBetween = new PercentType(67);
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ DimmerItem item = (DimmerItem) items.get(name);
+
+ item.setState(state1);
+
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfigTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfigTest.java
new file mode 100644
index 0000000000000..1f0a0ff600c9a
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBConfigTest.java
@@ -0,0 +1,217 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.*;
+
+import java.io.File;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.commons.io.FileUtils;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import com.amazonaws.regions.Regions;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DynamoDBConfigTest {
+
+ private static Map<String, Object> mapFrom(String... args) {
+ assert args.length % 2 == 0;
+ Map<String, Object> config = new HashMap<>();
+ for (int i = 1; i < args.length; i += 2) {
+ String key = args[i - 1];
+ String val = args[i];
+ config.put(key, val);
+ }
+ return Collections.unmodifiableMap(config);
+ }
+
+ @Rule
+ public TemporaryFolder folder = new TemporaryFolder();
+
+ @Test
+ public void testEmpty() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(new HashMap<>()));
+ }
+
+ @Test
+ public void testInvalidRegion() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(Collections.singletonMap("region", "foobie")));
+ }
+
+ @Test
+ public void testRegionOnly() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(Collections.singletonMap("region", "eu-west-1")));
+ }
+
+ @Test
+ public void testRegionWithAccessKeys() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig
+ .fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithProfilesConfigFile() throws Exception {
+ File credsFile = folder.newFile("creds");
+ FileUtils.write(credsFile, "[fooprofile]\n" + "aws_access_key_id=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n");
+
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
+ credsFile.getAbsolutePath(), "profile", "fooprofile"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testNullConfiguration() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(null));
+ }
+
+ @Test
+ public void testEmptyConfiguration() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(mapFrom()));
+ }
+
+ @Test
+ public void testRegionWithInvalidProfilesConfigFile() throws Exception {
+ File credsFile = folder.newFile("creds");
+ FileUtils.write(credsFile, "[fooprofile]\n" + "aws_access_key_idINVALIDKEY=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n");
+
+ assertNull(DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
+ credsFile.getAbsolutePath(), "profile", "fooprofile")));
+ }
+
+ @Test
+ public void testRegionWithProfilesConfigFileMissingProfile() throws Exception {
+ File credsFile = folder.newFile("creds");
+ FileUtils.write(credsFile, "[fooprofile]\n" + "aws_access_key_id=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n");
+
+ assertNull(DynamoDBConfig
+ .fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile", credsFile.getAbsolutePath())));
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefix() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "tablePrefix", "foobie-"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("foobie-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithCreateTable() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(
+ mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1", "createTable", "false"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(false, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "readCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(5, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithWriteCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "writeCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadWriteCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "readCapacityUnits", "3", "writeCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(3, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadWriteCapacityUnitsWithBufferSettings() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(
+ mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1", "readCapacityUnits", "3",
+ "writeCapacityUnits", "5", "bufferCommitIntervalMillis", "501", "bufferSize", "112"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(3, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(501L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(112, fromConfig.getBufferSize());
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolverTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolverTest.java
new file mode 100644
index 0000000000000..fb438c0b3f418
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/DynamoDBTableNameResolverTest.java
@@ -0,0 +1,36 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.assertEquals;
+
+import org.junit.Test;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DynamoDBTableNameResolverTest {
+
+ @Test
+ public void testWithDynamoDBBigDecimalItem() {
+ assertEquals("prefixbigdecimal",
+ new DynamoDBTableNameResolver("prefix").fromItem(new DynamoDBBigDecimalItem()));
+ }
+
+ @Test
+ public void testWithDynamoDBStringItem() {
+ assertEquals("prefixstring", new DynamoDBTableNameResolver("prefix").fromItem(new DynamoDBStringItem()));
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/LocationItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/LocationItemIntegrationTest.java
new file mode 100644
index 0000000000000..34690bf611096
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/LocationItemIntegrationTest.java
@@ -0,0 +1,76 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class LocationItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "location";
+ // values are encoded as lat,lon[,alt]; ordering follows lexicographic string comparison
+ private static final PointType state1 = new PointType(
+ new DecimalType("60.012033100120453345435345345345346365434630300230230032020393149"), new DecimalType(30.),
+ new DecimalType(3.0));
+ private static final PointType state2 = new PointType(new DecimalType(61.0), new DecimalType(30.));
+ private static final PointType stateBetween = new PointType(new DecimalType(60.5), new DecimalType(30.));
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ LocationItem item = (LocationItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/NumberItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/NumberItemIntegrationTest.java
new file mode 100644
index 0000000000000..9b908984f50be
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/NumberItemIntegrationTest.java
@@ -0,0 +1,89 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.assertTrue;
+
+import java.math.BigDecimal;
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class NumberItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "number";
+ // Deliberately over-precise number here (testing the precision limits of AWS DynamoDB)
+ private static final DecimalType state1 = new DecimalType(new BigDecimal(
+ "-32343243.193490838904389298049802398048923849032809483209483209482309840239840932840932849083094809483"));
+ private static final DecimalType state2 = new DecimalType(600.9123);
+ private static final DecimalType stateBetween = new DecimalType(500);
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ NumberItem item = (NumberItem) items.get(name);
+
+ item.setState(state1);
+
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+ /**
+ * Use relaxed state comparison due to numerical rounding. See also DynamoDBBigDecimalItem.loseDigits
+ */
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ BigDecimal expectedDecimal = ((DecimalType) expected).toBigDecimal();
+ BigDecimal actualDecimal = ((DecimalType) actual).toBigDecimal();
+ assertTrue(DynamoDBBigDecimalItem.loseDigits(expectedDecimal).compareTo(actualDecimal) == 0);
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/RollershutterItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/RollershutterItemIntegrationTest.java
new file mode 100644
index 0000000000000..4923991eddf27
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/RollershutterItemIntegrationTest.java
@@ -0,0 +1,85 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.Assert.assertTrue;
+
+import java.math.BigDecimal;
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class RollershutterItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "rollershutter";
+ private static final PercentType state1 = PercentType.ZERO;
+ private static final PercentType state2 = new PercentType("72.938289428989489389329834898929892439842399483498");
+ private static final PercentType stateBetween = new PercentType(66); // no stored state with exactly this value exists
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ RollershutterItem item = (RollershutterItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+ /**
+ * Use relaxed state comparison due to numerical rounding. See also DynamoDBBigDecimalItem.loseDigits
+ */
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ BigDecimal expectedDecimal = ((DecimalType) expected).toBigDecimal();
+ BigDecimal actualDecimal = ((DecimalType) actual).toBigDecimal();
+ assertTrue(DynamoDBBigDecimalItem.loseDigits(expectedDecimal).compareTo(actualDecimal) == 0);
+ }
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/StringItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/StringItemIntegrationTest.java
new file mode 100644
index 0000000000000..c6d52c0838722
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/StringItemIntegrationTest.java
@@ -0,0 +1,72 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class StringItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "string";
+ private static final StringType state1 = new StringType("b001");
+ private static final StringType state2 = new StringType("c002");
+ private static final StringType stateBetween = new StringType("b001");
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ StringItem item = (StringItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/SwitchItemIntegrationTest.java b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/SwitchItemIntegrationTest.java
new file mode 100644
index 0000000000000..966c5256b4513
--- /dev/null
+++ b/bundles/org.openhab.persistence.dynamodb/src/test/java/org/openhab/persistence/dynamodb/internal/SwitchItemIntegrationTest.java
@@ -0,0 +1,74 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Date;
+
+import org.junit.BeforeClass;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class SwitchItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String name = "switch";
+ private static final OnOffType state1 = OnOffType.OFF;
+ private static final OnOffType state2 = OnOffType.ON;
+ // There is no OnOffType state value between OFF and ON.
+ // Setting stateBetween to null omits the extended query tests in AbstractTwoItemIntegrationTest.
+ private static final OnOffType stateBetween = null;
+
+ @BeforeClass
+ public static void storeData() throws InterruptedException {
+ SwitchItem item = (SwitchItem) items.get(name);
+ item.setState(state1);
+ beforeStore = new Date();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = new Date();
+ Thread.sleep(10);
+ item.setState(state2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = new Date();
+
+ logger.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return name;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return state1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return state2;
+ }
+
+ @Override
+ protected State getQueryItemStateBetween() {
+ return stateBetween;
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.influxdb/.classpath b/bundles/org.openhab.persistence.influxdb/.classpath
new file mode 100644
index 0000000000000..19368e503c1f5
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.classpath
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.influxdb/.project b/bundles/org.openhab.persistence.influxdb/.project
new file mode 100644
index 0000000000000..bf2c7afe2fada
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.project
@@ -0,0 +1,23 @@
+
+
+ org.openhab.persistence.influxdb
+
+
+
+
+
+ org.eclipse.jdt.core.javabuilder
+
+
+
+
+ org.eclipse.m2e.core.maven2Builder
+
+
+
+
+
+ org.eclipse.jdt.core.javanature
+ org.eclipse.m2e.core.maven2Nature
+
+
diff --git a/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..e9441bb123ec3
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding/=UTF-8
diff --git a/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..29fe717b8794b
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.ui.prefs b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.ui.prefs
new file mode 100644
index 0000000000000..fe89f28bca590
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.jdt.ui.prefs
@@ -0,0 +1,2 @@
+eclipse.preferences.version=1
+formatter_settings_version=12
diff --git a/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..f29e940a0059c
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.influxdb/NOTICE b/bundles/org.openhab.persistence.influxdb/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
+
diff --git a/bundles/org.openhab.persistence.influxdb/README.md b/bundles/org.openhab.persistence.influxdb/README.md
new file mode 100644
index 0000000000000..470e4d6d0bb15
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/README.md
@@ -0,0 +1,35 @@
+# InfluxDB (0.9 and newer) Persistence
+
+This service allows you to persist and query states using the [InfluxDB](http://influxdb.org) time series database. The persisted values can be queried from within openHAB. There are also useful tools for visualizing InfluxDB time series, such as [Grafana](http://grafana.org/).
+
+> There are two InfluxDB persistence bundles, supporting different InfluxDB versions. This service, named `influxdb`, supports InfluxDB 0.9 and newer; the `influxdb08` service supports InfluxDB up to version 0.8.x.
+
+## Database Structure
+
+The states of an item are persisted in time series whose names equal the name of the item. All values are stored in a field called "value": integers and doubles are stored as-is, while `OnOffType` and `OpenClosedType` values are stored as 0 or 1. The timestamps for the entries are assigned by InfluxDB.
+
+An example entry for an item with the name "AmbientLight" would look like this:
+
+|time | sequence_number| value|
+|-----|-----------------|-------|
+|1402243200072 | 79370001 | 6|
+
+## Prerequisites
+
+First of all, you have to set up and run an InfluxDB server. This is straightforward, and good documentation is available on the [InfluxDB web site](https://docs.influxdata.com/influxdb/v1.7/).
+
+Then the database and a user must be created. This can be done using the InfluxDB admin web interface. If you want to use the defaults, create a database called `openhab` and a user named `openhab` with write access to that database. Choose a password and remember it.
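+
+As an alternative to the admin web interface, the database and user can be created with the `influx` CLI. A minimal sketch using InfluxDB 1.x InfluxQL (the password `changeme` is a placeholder you should replace):
+
+```sql
+-- create the database and a user with write access
+CREATE DATABASE openhab
+CREATE USER openhab WITH PASSWORD 'changeme'
+GRANT WRITE ON openhab TO openhab
+```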
+
+## Configuration
+
+This service can be configured in the file `services/influxdb.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| url | http://127.0.0.1:8086 | No | database URL |
+| user | openhab | No | name of the database user, e.g. `openhab` |
+| password | | Yes | password of the database user that you chose in [Prerequisites](#prerequisites) above |
+| db | openhab | No | name of the database |
+| retentionPolicy | autogen | No | name of the retention policy. Note that starting with InfluxDB 1.0, the default retention policy is named `autogen` instead of `default`. |
+
+All item- and event-related configuration is defined in the file `persistence/influxdb.persist`.
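+
+A minimal sketch of `services/influxdb.cfg` using the defaults described above (the password value is a placeholder):
+
+```
+url=http://127.0.0.1:8086
+user=openhab
+password=changeme
+db=openhab
+retentionPolicy=autogen
+```
+
+A simple `persistence/influxdb.persist` file might look like the following (the item name is an example):
+
+```
+Strategies {
+    everyMinute : "0 * * * * ?"
+    default = everyChange
+}
+
+Items {
+    AmbientLight : strategy = everyChange, restoreOnStartup
+}
+```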
diff --git a/bundles/org.openhab.persistence.influxdb/pom.xml b/bundles/org.openhab.persistence.influxdb/pom.xml
new file mode 100644
index 0000000000000..2ce4447280bf7
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/pom.xml
@@ -0,0 +1,47 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+  <modelVersion>4.0.0</modelVersion>
+
+  <parent>
+    <groupId>org.openhab.addons.bundles</groupId>
+    <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+    <version>3.0.0-SNAPSHOT</version>
+  </parent>
+
+  <artifactId>org.openhab.persistence.influxdb</artifactId>
+
+  <name>openHAB Add-ons :: Bundles :: Persistence Service :: InfluxDB</name>
+
+  <properties>
+    <bnd.importpackage>!android.*,!com.android.*,!com.google.appengine.*,!org.apache.harmony.*,!org.apache.http.*,!rx.*</bnd.importpackage>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>org.influxdb</groupId>
+      <artifactId>influxdb-java</artifactId>
+      <version>2.2</version>
+    </dependency>
+    <dependency>
+      <groupId>com.squareup.okhttp</groupId>
+      <artifactId>okhttp</artifactId>
+      <version>2.4.0</version>
+    </dependency>
+    <dependency>
+      <groupId>com.squareup.okio</groupId>
+      <artifactId>okio</artifactId>
+      <version>1.4.0</version>
+    </dependency>
+    <dependency>
+      <groupId>com.squareup.retrofit</groupId>
+      <artifactId>retrofit</artifactId>
+      <version>1.9.0</version>
+    </dependency>
+  </dependencies>
+
+</project>
diff --git a/bundles/org.openhab.persistence.influxdb/src/main/feature/feature.xml b/bundles/org.openhab.persistence.influxdb/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..bee762db6a05f
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/src/main/feature/feature.xml
@@ -0,0 +1,11 @@
+
+
+ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features
+
+
+ openhab-runtime-base
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.influxdb/${project.version}
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/influxdb
+
+
+
diff --git a/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxDBPersistenceService.java b/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxDBPersistenceService.java
new file mode 100644
index 0000000000000..e39de5c280214
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxDBPersistenceService.java
@@ -0,0 +1,601 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.apache.commons.lang.StringUtils.isBlank;
+
+import java.math.BigDecimal;
+import java.util.ArrayList;
+import java.util.Calendar;
+import java.util.Collections;
+import java.util.Date;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Point;
+import org.influxdb.dto.Pong;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult.Result;
+import org.influxdb.dto.QueryResult.Series;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import retrofit.RetrofitError;
+
+/**
+ * This is the implementation of the InfluxDB {@link PersistenceService}. It persists item values
+ * using the InfluxDB time series database. The states (
+ * {@link State}) of an {@link Item} are persisted in a time series with names equal to the name of
+ * the item. All values are stored using integers or doubles, {@link OnOffType} and
+ * {@link OpenClosedType} are stored using 0 or 1.
+ *
+ * The defaults for the database name, the database user and the database url are "openhab",
+ * "openhab" and "http://127.0.0.1:8086".
+ *
+ * @author Theo Weiss - Initial contribution, rewrite of org.openhab.persistence.influxdb
+ * @author Kai Kreuzer - Migration to 3.x
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.influxdb", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class InfluxDBPersistenceService implements QueryablePersistenceService {
+
+ private static final String DEFAULT_URL = "http://127.0.0.1:8086";
+ private static final String DEFAULT_DB = "openhab";
+ private static final String DEFAULT_USER = "openhab";
+ private static final String DEFAULT_RETENTION_POLICY = "autogen";
+ private static final String DIGITAL_VALUE_OFF = "0";
+ private static final String DIGITAL_VALUE_ON = "1";
+ private static final String VALUE_COLUMN_NAME = "value";
+
+ private @NonNullByDefault({}) InfluxDB influxDB;
+ private static final Logger logger = LoggerFactory.getLogger(InfluxDBPersistenceService.class);
+ private static final String TIME_COLUMN_NAME = "time";
+ private static final TimeUnit timeUnit = TimeUnit.MILLISECONDS;
+ private @NonNullByDefault({}) String dbName;
+ private @NonNullByDefault({}) String url;
+ private @NonNullByDefault({}) String user;
+ private @Nullable String password;
+ private @NonNullByDefault({}) String retentionPolicy;
+ private boolean isProperlyConfigured;
+ private boolean connected;
+
+ @Reference
+ protected @NonNullByDefault({}) ItemRegistry itemRegistry;
+
+ @Activate
+ public void activate(final BundleContext bundleContext, final Map<String, Object> config) {
+ logger.debug("influxdb persistence service activated");
+ disconnect();
+ password = (String) config.get("password");
+ if (isBlank(password)) {
+ isProperlyConfigured = false;
+ logger.error(
+ "The password is missing. To specify a password configure the password parameter in influxdb.cfg.");
+ return;
+ }
+
+ url = (String) config.get("url");
+ if (isBlank(url)) {
+ url = DEFAULT_URL;
+ logger.debug("using default url {}", DEFAULT_URL);
+ }
+
+ user = (String) config.get("user");
+ if (isBlank(user)) {
+ user = DEFAULT_USER;
+ logger.debug("using default user {}", DEFAULT_USER);
+ }
+
+ dbName = (String) config.get("db");
+ if (isBlank(dbName)) {
+ dbName = DEFAULT_DB;
+ logger.debug("using default db name {}", DEFAULT_DB);
+ }
+ retentionPolicy = (String) config.get("retentionPolicy");
+ if (isBlank(retentionPolicy)) {
+ retentionPolicy = DEFAULT_RETENTION_POLICY;
+ logger.debug("using default retentionPolicy {}", DEFAULT_RETENTION_POLICY);
+ }
+ isProperlyConfigured = true;
+
+ connect();
+
+ // check connection; errors will only be logged, hoping the connection will work at a later
+ // time.
+ if (!checkConnection()) {
+ logger.error("database connection does not work for now, will retry to use the database.");
+ }
+ }
+
+ @Deactivate
+ public void deactivate() {
+ logger.debug("influxdb persistence service deactivated");
+ disconnect();
+ }
+
+ private void connect() {
+ if (influxDB == null) {
+ // reuse an existing InfluxDB client; it does not keep a stateful
+ // connection to the database
+ influxDB = InfluxDBFactory.connect(url, user, password);
+ influxDB.enableBatch(200, 100, timeUnit);
+ }
+ connected = true;
+ }
+
+ private boolean checkConnection() {
+ boolean dbStatus = false;
+ if (!connected) {
+ logger.error("checkConnection: database is not connected");
+ dbStatus = false;
+ } else {
+ try {
+ Pong pong = influxDB.ping();
+ String version = pong.getVersion();
+ // maybe add a check for version >= 0.9
+ if (version != null && !version.contains("unknown")) {
+ dbStatus = true;
+ logger.debug("database status is OK, version is {}", version);
+ } else {
+ logger.error("database ping error, version is: \"{}\" response time was \"{}\"", version,
+ pong.getResponseTime());
+ dbStatus = false;
+ }
+ } catch (RuntimeException e) {
+ dbStatus = false;
+ logger.error("database connection failed", e);
+ handleDatabaseException(e);
+ }
+ }
+ return dbStatus;
+ }
+
+ private void disconnect() {
+ influxDB = null;
+ connected = false;
+ }
+
+ private boolean isConnected() {
+ return connected;
+ }
+
+ @Override
+ public String getId() {
+ return "influxdb";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "InfluxDB";
+ }
+
+ @Override
+ public @NonNull Set<@NonNull PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ if (!isProperlyConfigured) {
+ logger.warn("Configuration for influxdb not yet loaded or broken.");
+ return;
+ }
+
+ if (!isConnected()) {
+ logger.warn("InfluxDB is not yet connected");
+ return;
+ }
+
+ String realName = item.getName();
+ String name = (alias != null) ? alias : realName;
+
+ State state = null;
+ if (item.getAcceptedCommandTypes().contains(HSBType.class)) {
+ state = item.getStateAs(HSBType.class);
+ logger.trace("Tried to get item as {}, state is {}", HSBType.class, state);
+ } else if (item.getAcceptedDataTypes().contains(PercentType.class)) {
+ state = item.getStateAs(PercentType.class);
+ logger.trace("Tried to get item as {}, state is {}", PercentType.class, state);
+ } else {
+ // All other items should return the best format by default
+ state = item.getState();
+ logger.trace("Tried to get item from item class {}, state is {}", item.getClass(), state);
+ }
+ Object value = stateToObject(state);
+ logger.trace("storing {} in influxdb value {}, {}", name, value, item);
+ Point point = Point.measurement(name).field(VALUE_COLUMN_NAME, value).time(System.currentTimeMillis(), timeUnit)
+ .build();
+ try {
+ influxDB.write(dbName, retentionPolicy, point);
+ } catch (RuntimeException e) {
+ logger.error("storing failed with exception for item: {}", name);
+ handleDatabaseException(e);
+ }
+ }
+
+ private void handleDatabaseException(Exception e) {
+ if (e instanceof RetrofitError) {
+ // e.g. raised if influxdb is not running
+ logger.error("database connection error {}", e.getMessage());
+ } else if (e instanceof RuntimeException) {
+ // e.g. raised by authentication errors
+ logger.error("database error: {}", e.getMessage());
+ }
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("got a query");
+
+ if (!isProperlyConfigured) {
+ logger.warn("Configuration for influxdb not yet loaded or broken.");
+ return Collections.emptyList();
+ }
+
+ if (!isConnected()) {
+ logger.warn("InfluxDB is not yet connected");
+ return Collections.emptyList();
+ }
+
+ List<HistoricItem> historicItems = new ArrayList<>();
+
+ StringBuilder query = new StringBuilder();
+ query.append("select ").append(VALUE_COLUMN_NAME).append(' ').append("from \"").append(retentionPolicy)
+ .append("\".");
+
+ if (filter.getItemName() != null) {
+ query.append('"').append(filter.getItemName()).append('"');
+ } else {
+ query.append("/.*/");
+ }
+
+ logger.trace(
+ "Filter: itemname: {}, ordering: {}, state: {}, operator: {}, getBeginDate: {}, getEndDate: {}, getPageSize: {}, getPageNumber: {}",
+ filter.getItemName(), filter.getOrdering().toString(), filter.getState(), filter.getOperator(),
+ filter.getBeginDate(), filter.getEndDate(), filter.getPageSize(), filter.getPageNumber());
+
+ if ((filter.getState() != null && filter.getOperator() != null) || filter.getBeginDate() != null
+ || filter.getEndDate() != null) {
+ query.append(" where ");
+ boolean foundState = false;
+ boolean foundBeginDate = false;
+ if (filter.getState() != null && filter.getOperator() != null) {
+ String value = stateToString(filter.getState());
+ if (value != null) {
+ foundState = true;
+ query.append(VALUE_COLUMN_NAME);
+ query.append(" ");
+ query.append(filter.getOperator().toString());
+ query.append(" ");
+ query.append(value);
+ }
+ }
+
+ if (filter.getBeginDate() != null) {
+ foundBeginDate = true;
+ if (foundState) {
+ query.append(" and");
+ }
+ query.append(" ");
+ query.append(TIME_COLUMN_NAME);
+ query.append(" > ");
+ query.append(getTimeFilter(filter.getBeginDate()));
+ query.append(" ");
+ }
+
+ if (filter.getEndDate() != null) {
+ if (foundState || foundBeginDate) {
+ query.append(" and");
+ }
+ query.append(" ");
+ query.append(TIME_COLUMN_NAME);
+ query.append(" < ");
+ query.append(getTimeFilter(filter.getEndDate()));
+ query.append(" ");
+ }
+
+ }
+
+ if (filter.getOrdering() == Ordering.DESCENDING) {
+ query.append(String.format(" ORDER BY %s DESC", TIME_COLUMN_NAME));
+ logger.debug("descending ordering ");
+ }
+
+ int limit = (filter.getPageNumber() + 1) * filter.getPageSize();
+ query.append(" limit " + limit);
+ logger.trace("appending limit {}", limit);
+
+ int totalEntriesAffected = ((filter.getPageNumber() + 1) * filter.getPageSize());
+ int startEntryNum = totalEntriesAffected
+ - (totalEntriesAffected - (filter.getPageSize() * filter.getPageNumber()));
+ logger.trace("startEntryNum {}", startEntryNum);
+
+ logger.debug("query string: {}", query.toString());
+ Query influxdbQuery = new Query(query.toString(), dbName);
+
+ List<Result> results = Collections.emptyList();
+ results = influxDB.query(influxdbQuery, timeUnit).getResults();
+ for (Result result : results) {
+ List<Series> seriess = result.getSeries();
+ if (result.getError() != null) {
+ logger.error("{}", result.getError());
+ continue;
+ }
+ if (seriess == null) {
+ logger.debug("query returned no series");
+ } else {
+ for (Series series : seriess) {
+ logger.trace("series {}", series.toString());
+ String historicItemName = series.getName();
+ List<List<Object>> valuess = series.getValues();
+ if (valuess == null) {
+ logger.debug("query returned no values");
+ } else {
+ List<String> columns = series.getColumns();
+ logger.trace("columns {}", columns);
+ Integer timestampColumn = null;
+ Integer valueColumn = null;
+ for (int i = 0; i < columns.size(); i++) {
+ String columnName = columns.get(i);
+ if (columnName.equals(TIME_COLUMN_NAME)) {
+ timestampColumn = i;
+ } else if (columnName.equals(VALUE_COLUMN_NAME)) {
+ valueColumn = i;
+ }
+ }
+ if (valueColumn == null || timestampColumn == null) {
+ throw new RuntimeException("missing column");
+ }
+ for (int i = 0; i < valuess.size(); i++) {
+ Double rawTime = (Double) valuess.get(i).get(timestampColumn);
+ Date time = new Date(rawTime.longValue());
+ State value = objectToState(valuess.get(i).get(valueColumn), historicItemName);
+ logger.trace("adding historic item {}: time {} value {}", historicItemName, time, value);
+ historicItems.add(new InfluxdbItem(historicItemName, value, time));
+ }
+ }
+ }
+ }
+ }
+ return historicItems;
+ }
+
+ private String getTimeFilter(Date time) {
+ // for some reason we need to query using 'seconds' only;
+ // passing milliseconds causes no results to be returned
+ long milliSeconds = time.getTime();
+ long seconds = milliSeconds / 1000;
+ return seconds + "s";
+ }
+
+ /**
+ * This method returns an integer if possible; otherwise a double is returned. This is an
+ * optimization for influxdb because integers have less overhead.
+ *
+ * @param value the BigDecimal to be converted
+ * @return an integer if possible, otherwise a double
+ */
+ private Object convertBigDecimalToNum(BigDecimal value) {
+ Object convertedValue;
+ if (value.scale() == 0) {
+ logger.trace("found no fractional part");
+ convertedValue = value.toBigInteger();
+ } else {
+ logger.trace("found fractional part");
+ convertedValue = value.doubleValue();
+ }
+ return convertedValue;
+ }
+
+ /**
+ * Converts {@link State} to objects fitting into influxdb values.
+ *
+ * @param state to be converted
+ * @return integer or double value for DecimalType, 0 or 1 for OnOffType and OpenClosedType,
+ * integer for DateTimeType, String for all others
+ */
+ private Object stateToObject(State state) {
+ Object value;
+ if (state instanceof HSBType) {
+ value = ((HSBType) state).toString();
+ logger.debug("got HSBType value {}", value);
+ } else if (state instanceof PointType) {
+ value = point2String((PointType) state);
+ logger.debug("got PointType value {}", value);
+ } else if (state instanceof DecimalType) {
+ value = convertBigDecimalToNum(((DecimalType) state).toBigDecimal());
+ logger.debug("got DecimalType value {}", value);
+ } else if (state instanceof OnOffType) {
+ value = (OnOffType) state == OnOffType.ON ? 1 : 0;
+ logger.debug("got OnOffType value {}", value);
+ } else if (state instanceof OpenClosedType) {
+ value = (OpenClosedType) state == OpenClosedType.OPEN ? 1 : 0;
+ logger.debug("got OpenClosedType value {}", value);
+ } else if (state instanceof DateTimeType) {
+ value = ((DateTimeType) state).getCalendar().getTime().getTime();
+ logger.debug("got DateTimeType value {}", value);
+ } else {
+ value = state.toString();
+ logger.debug("got String value {}", value);
+ }
+ return value;
+ }
+
+ /**
+ * Converts {@link State} to a String suitable for influxdb queries.
+ *
+ * @param state to be converted
+ * @return {@link String} equivalent of the {@link State}
+ */
+ private String stateToString(State state) {
+ String value;
+ if (state instanceof DecimalType) {
+ value = ((DecimalType) state).toBigDecimal().toString();
+ } else if (state instanceof PointType) {
+ value = point2String((PointType) state);
+ } else if (state instanceof OnOffType) {
+ value = ((OnOffType) state) == OnOffType.ON ? DIGITAL_VALUE_ON : DIGITAL_VALUE_OFF;
+ } else if (state instanceof OpenClosedType) {
+ value = ((OpenClosedType) state) == OpenClosedType.OPEN ? DIGITAL_VALUE_ON : DIGITAL_VALUE_OFF;
+ } else if (state instanceof DateTimeType) {
+ value = String.valueOf(((DateTimeType) state).getCalendar().getTime().getTime());
+ } else {
+ value = state.toString();
+ }
+ return value;
+ }
+
+ /**
+ * Converts a value to a {@link State} which is suitable for the given {@link Item}. This is
+ * needed when querying a {@link HistoricItem}.
+ *
+ * @param value to be converted to a {@link State}
+ * @param itemName name of the {@link Item} to get the {@link State} for
+ * @return the state of the item represented by the itemName parameter, else the string value of
+ * the Object parameter
+ */
+ private State objectToState(Object value, String itemName) {
+ String valueStr = String.valueOf(value);
+ if (itemRegistry != null) {
+ try {
+ Item item = itemRegistry.getItem(itemName);
+ if (item instanceof GroupItem) {
+ item = ((GroupItem) item).getBaseItem();
+ }
+ if (item instanceof ColorItem) {
+ logger.debug("objectToState found a ColorItem {}", valueStr);
+ return new HSBType(valueStr);
+ } else if (item instanceof LocationItem) {
+ logger.debug("objectToState found a LocationItem");
+ return new PointType(valueStr);
+ } else if (item instanceof NumberItem) {
+ logger.debug("objectToState found a NumberItem");
+ return new DecimalType(valueStr);
+ } else if (item instanceof DimmerItem) {
+ logger.debug("objectToState found a DimmerItem");
+ return new PercentType(valueStr);
+ } else if (item instanceof SwitchItem) {
+ logger.debug("objectToState found a SwitchItem");
+ return string2DigitalValue(valueStr).equals(DIGITAL_VALUE_OFF) ? OnOffType.OFF : OnOffType.ON;
+ } else if (item instanceof ContactItem) {
+ logger.debug("objectToState found a ContactItem");
+ return (string2DigitalValue(valueStr).equals(DIGITAL_VALUE_OFF)) ? OpenClosedType.CLOSED
+ : OpenClosedType.OPEN;
+ } else if (item instanceof RollershutterItem) {
+ logger.debug("objectToState found a RollershutterItem");
+ return new PercentType(valueStr);
+ } else if (item instanceof DateTimeItem) {
+ logger.debug("objectToState found a DateItem");
+ Calendar calendar = Calendar.getInstance();
+ calendar.setTimeInMillis(new BigDecimal(valueStr).longValue());
+ return new DateTimeType(calendar);
+ } else {
+ logger.debug("objectToState found a other Item");
+ return new StringType(valueStr);
+ }
+ } catch (ItemNotFoundException e) {
+ logger.warn("Could not find item '{}' in registry", itemName);
+ }
+ }
+ // just return a StringType as a fallback
+ return new StringType(valueStr);
+ }
+
+ /**
+ * Maps a string value which expresses {@link BigDecimal#ZERO} to DIGITAL_VALUE_OFF, all others
+ * to DIGITAL_VALUE_ON.
+ *
+ * @param value to be mapped
+ * @return DIGITAL_VALUE_OFF or DIGITAL_VALUE_ON
+ */
+ private String string2DigitalValue(String value) {
+ BigDecimal num = new BigDecimal(value);
+ if (num.compareTo(BigDecimal.ZERO) == 0) {
+ logger.trace("digitalvalue {}", DIGITAL_VALUE_OFF);
+ return DIGITAL_VALUE_OFF;
+ } else {
+ logger.trace("digitalvalue {}", DIGITAL_VALUE_ON);
+ return DIGITAL_VALUE_ON;
+ }
+ }
+
+ private String point2String(PointType point) {
+ StringBuilder buf = new StringBuilder();
+ buf.append(point.getLatitude().toString());
+ buf.append(",");
+ buf.append(point.getLongitude().toString());
+ if (!point.getAltitude().equals(DecimalType.ZERO)) {
+ buf.append(",");
+ buf.append(point.getAltitude().toString());
+ }
+ return buf.toString(); // latitude, longitude, altitude
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+}
diff --git a/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxdbItem.java b/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxdbItem.java
new file mode 100644
index 0000000000000..b8d393e8c08a1
--- /dev/null
+++ b/bundles/org.openhab.persistence.influxdb/src/main/java/org/openhab/persistence/influxdb/internal/InfluxdbItem.java
@@ -0,0 +1,61 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is a Java bean used to return historic items from Influxdb.
+ *
+ * @author Theo Weiss - Initial Contribution
+ *
+ */
+@NonNullByDefault
+public class InfluxdbItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final Date timestamp;
+
+ public InfluxdbItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.jdbc/.classpath b/bundles/org.openhab.persistence.jdbc/.classpath
new file mode 100644
index 0000000000000..193f045c13ed4
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.classpath
@@ -0,0 +1,32 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.jdbc/.gitignore b/bundles/org.openhab.persistence.jdbc/.gitignore
new file mode 100644
index 0000000000000..8d6d366083367
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.gitignore
@@ -0,0 +1,2 @@
+/build/
+/drivers/
diff --git a/bundles/org.openhab.persistence.jdbc/.project b/bundles/org.openhab.persistence.jdbc/.project
new file mode 100644
index 0000000000000..e6f57f276ca93
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.project
@@ -0,0 +1,23 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>org.openhab.persistence.jdbc</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.eclipse.jdt.core.javabuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.m2e.core.maven2Builder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.eclipse.jdt.core.javanature</nature>
+		<nature>org.eclipse.m2e.core.maven2Nature</nature>
+	</natures>
+</projectDescription>
diff --git a/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..abdea9ac032d4
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,4 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding//src/main/resources=UTF-8
+encoding/=UTF-8
diff --git a/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..29fe717b8794b
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.ui.prefs b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.ui.prefs
new file mode 100644
index 0000000000000..fe89f28bca590
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.jdt.ui.prefs
@@ -0,0 +1,2 @@
+eclipse.preferences.version=1
+formatter_settings_version=12
diff --git a/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..f29e940a0059c
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.jdbc/NOTICE b/bundles/org.openhab.persistence.jdbc/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-core
+
diff --git a/bundles/org.openhab.persistence.jdbc/README.md b/bundles/org.openhab.persistence.jdbc/README.md
new file mode 100644
index 0000000000000..aa7d08112df1d
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/README.md
@@ -0,0 +1,162 @@
+# JDBC Persistence
+
+This service writes and reads item states to and from a number of relational database systems that support [Java Database Connectivity (JDBC)](https://en.wikipedia.org/wiki/Java_Database_Connectivity). It allows you to persist state updates using one of several different underlying database services, and is designed for maximum scalability, so that it can store very large amounts of data without losing speed over the years.
+
+The generic design makes it relatively easy for developers to integrate other databases that have JDBC drivers. The following databases are currently supported and tested:
+
+| Database | Tested Driver / Version |
+|----------------------------------------------|-------------------------|
+| [Apache Derby](https://db.apache.org/derby/) | [derby-10.12.1.1.jar](http://mvnrepository.com/artifact/org.apache.derby/derby) |
+| [H2](http://www.h2database.com/) | [h2-1.4.191.jar](http://mvnrepository.com/artifact/com.h2database/h2) |
+| [HSQLDB](http://hsqldb.org/) | [hsqldb-2.3.3.jar](http://mvnrepository.com/artifact/org.hsqldb/hsqldb) |
+| [MariaDB](https://mariadb.org/) | [mariadb-java-client-1.4.6.jar](http://mvnrepository.com/artifact/org.mariadb.jdbc/mariadb-java-client) |
+| [MySQL](https://www.mysql.com/) | [mysql-connector-java-5.1.39.jar](http://mvnrepository.com/artifact/mysql/mysql-connector-java) |
+| [PostgreSQL](http://www.postgresql.org/) | [postgresql-9.4.1209.jre7.jar](http://mvnrepository.com/artifact/org.postgresql/postgresql) |
+| [SQLite](https://www.sqlite.org/) | [sqlite-jdbc-3.16.1.jar](http://mvnrepository.com/artifact/org.xerial/sqlite-jdbc) |
+
+## Table of Contents
+
+
+
+- [Configuration](#configuration)
+ - [Minimal Configuration](#minimal-configuration)
+ - [Migration from MySQL to JDBC Persistence Services](#migration-from-mysql-to-jdbc-persistence-services)
+- [Technical Notes](#technical-notes)
+ - [Database Table Schema](#database-table-schema)
+ - [Number Precision](#number-precision)
+ - [Rounding results](#rounding-results)
+ - [For Developers](#for-developers)
+ - [Performance Tests](#performance-tests)
+
+
+
+## Configuration
+
+This service can be configured in the file `services/jdbc.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| url | | Yes | JDBC URL to establish a connection to your database. Examples: `jdbc:derby:./testDerby;create=true` `jdbc:h2:./testH2` `jdbc:hsqldb:./testHsqlDb` `jdbc:mariadb://192.168.0.1:3306/testMariadb` `jdbc:mysql://192.168.0.1:3306/testMysql?serverTimezone=UTC` `jdbc:postgresql://192.168.0.1:5432/testPostgresql` `jdbc:sqlite:./testSqlite.db`. If no database is available it will be created; for example the url `jdbc:h2:./testH2` creates a new H2 database in openHAB folder. Example to create your own MySQL database directly: `CREATE DATABASE 'yourDB' CHARACTER SET utf8 COLLATE utf8_general_ci;`|
+| user | | if needed | database user name |
+| password | | if needed | database user password |
+| errReconnectThreshold | 0 | No | number of consecutive errors after which the service is deactivated (0 means ignore errors and keep the service active) |
+| sqltype.CALL | `VARCHAR(200)` | No | All `sqlType` options allow you to change the SQL data type used to store values for different openHAB item states. See the following links for further information: [mybatis](https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html) [H2](http://www.h2database.com/html/datatypes.html) [PostgresSQL](http://www.postgresql.org/docs/9.3/static/datatype.html) |
+| sqltype.COLOR | `VARCHAR(70)` | No | see above |
+| sqltype.CONTACT | `VARCHAR(6)` | No | see above |
+| sqltype.DATETIME | `DATETIME` | No | see above |
+| sqltype.DIMMER | `TINYINT` | No | see above |
+| sqltype.LOCATION | `VARCHAR(30)` | No | see above |
+| sqltype.NUMBER | `DOUBLE` | No | see above |
+| sqltype.ROLLERSHUTTER | `TINYINT` | No | see above |
+| sqltype.STRING | `VARCHAR(65500)` | No | see above |
+| sqltype.SWITCH | `VARCHAR(6)` | No | see above |
+| sqltype.tablePrimaryKey | `TIMESTAMP` | No | type of `time` column for newly created item tables |
+| sqltype.tablePrimaryValue | `NOW()` | No | value of `time` column for newly inserted rows |
+| numberDecimalcount | 3 | No | default decimal digit count for item type `Number` |
+| tableNamePrefix | `item` | No | table name prefix. For migration from the MySQL persistence service, set to `Item`. |
+| tableUseRealItemNames | `false` | No | table name prefix generation. When set to `true`, real item names are used for table names and `tableNamePrefix` is ignored. When set to `false`, the `tableNamePrefix` is used to generate table names with sequential numbers. |
+| tableIdDigitCount | 4 | No | when `tableUseRealItemNames` is `false` and thus table names are generated sequentially, this controls how many zero-padded digits are used in the table name. With the default of 4, the first table name will end with `0001`. For migration from the MySQL persistence service, set this to 0. |
+| rebuildTableNames | false | No | rename existing tables using `tableUseRealItemNames` and `tableIdDigitCount`. Use with care, and deactivate this option again once renaming is done! |
+| jdbc.maximumPoolSize | configured per database in package `org.openhab.persistence.jdbc.db.*` | No | Some embedded databases can handle only one connection. See [this link](https://github.com/brettwooldridge/HikariCP/issues/256) for more information |
+| jdbc.minimumIdle | see above | No | see above |
+| enableLogTime | `false` | No | timekeeping |
+
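+As an illustration of how `tableNamePrefix`, `tableUseRealItemNames` and `tableIdDigitCount` interact, here is a minimal sketch of the naming scheme (the `tableName` helper is hypothetical; the real service derives the numeric id from the items mapping table):
+
+```java
+public class TableNameDemo {
+
+    // Hypothetical helper sketching the table naming rules described above.
+    static String tableName(String itemName, int itemId, String prefix,
+            boolean useRealItemNames, int digitCount) {
+        if (useRealItemNames) {
+            return itemName; // prefix and id are ignored in this mode
+        }
+        if (digitCount <= 0) {
+            return prefix + itemId; // e.g. "Item1", as used for MySQL migration
+        }
+        return prefix + String.format("%0" + digitCount + "d", itemId); // e.g. "item0001"
+    }
+
+    public static void main(String[] args) {
+        System.out.println(tableName("Temperature", 1, "item", false, 4)); // item0001
+        System.out.println(tableName("Temperature", 1, "Item", false, 0)); // Item1
+        System.out.println(tableName("Temperature", 1, "item", true, 4));  // Temperature
+    }
+}
+```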
+All item- and event-related configuration is done in the file `persistence/jdbc.persist`.
+
+To configure this service as the default persistence service for openHAB 2, add or change the line
+
+```
+org.openhab.core.persistence:default=jdbc
+```
+
+in the file `services/runtime.cfg`.
+
+### Minimal Configuration
+
+services/jdbc.cfg
+
+```
+url=jdbc:postgresql://192.168.0.1:5432/testPostgresql
+```
+
+### Migration from MySQL to JDBC Persistence Services
+
+The JDBC Persistence service can act as a replacement for the MySQL Persistence service. Here is an example of a configuration for a MySQL database named `testMysql` with user `test` and password `test`:
+
+services/jdbc.cfg
+
+```
+url=jdbc:mysql://192.168.0.1:3306/testMysql
+user=test
+password=test
+tableNamePrefix=Item
+tableUseRealItemNames=false
+tableIdDigitCount=0
+```
+
+Remember to install the JDBC persistence service and uninstall the MySQL persistence service, and rename `persistence/mysql.persist` to `persistence/jdbc.persist`.
+
+## Technical Notes
+
+### Database Table Schema
+
+The table name schema can be reconfigured after creation, if needed.
+
+The service will create a mapping table to link each item to a table, and a separate table is generated for each item. The item data tables include time and data values. The SQL data type used depends on the openHAB item type, and allows the item state to be recovered back into openHAB in the same way it was stored.
+
+This *per-item* layout keeps the database scalable and easy to maintain, even when large amounts of data must be managed. To rename existing tables, use the parameters `tableUseRealItemNames` and `tableIdDigitCount` in the configuration.
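+The SQL statements themselves are built from templates containing `#placeholder#` tokens, as can be seen in `JdbcBaseDAO` later in this change set. A simplified sketch of that substitution step (the real implementation is `StringUtilsExt.replaceArrayMerge`):
+
+```java
+public class SqlTemplateDemo {
+
+    // Simplified placeholder substitution, mirroring how the DAO expands
+    // its SQL templates before execution.
+    static String replaceArrayMerge(String sql, String[] keys, String[] values) {
+        for (int i = 0; i < keys.length; i++) {
+            sql = sql.replace(keys[i], values[i]);
+        }
+        return sql;
+    }
+
+    public static void main(String[] args) {
+        String template = "CREATE TABLE IF NOT EXISTS #tableName# (time #tablePrimaryKey# NOT NULL, value #dbType#, PRIMARY KEY(time))";
+        String sql = replaceArrayMerge(template,
+                new String[] { "#tableName#", "#dbType#", "#tablePrimaryKey#" },
+                new String[] { "item0001", "DOUBLE", "TIMESTAMP" });
+        System.out.println(sql);
+    }
+}
+```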
+
+### Number Precision
+
+By default, openHAB number items are persisted with the SQL data type `double`, while internally openHAB uses `BigDecimal`. If better numerical precision is needed, set, for example, `sqltype.NUMBER = DECIMAL(max digits, max decimals)`; the service then works with `BigDecimal` on the Java side without type conversion. If a value has more decimals than `max decimals` allows, the persisted value is rounded mathematically correctly. The SQL types `DECIMAL` and `NUMERIC` are precise, but working with `DOUBLE` is faster.
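+For a `DECIMAL`/`NUMERIC` column the rounding step can be sketched with plain `BigDecimal` (assuming "mathematically correct" means `HALF_UP` rounding, which matches the query-side rounding used by the service):
+
+```java
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+
+public class PrecisionDemo {
+    public static void main(String[] args) {
+        // With sqltype.NUMBER = DECIMAL(10,3), at most 3 decimals survive.
+        BigDecimal state = new BigDecimal("20.12345");
+        System.out.println(state.setScale(3, RoundingMode.HALF_UP)); // 20.123
+        System.out.println(new BigDecimal("20.1235").setScale(3, RoundingMode.HALF_UP)); // 20.124
+    }
+}
+```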
+
+### Rounding results
+
+The results of database queries of number items are rounded to three decimal places by default. The number of decimals can be changed with `numberDecimalcount`. Especially when the SQL types `DECIMAL` or `NUMERIC` are used for `sqltype.NUMBER`, rounding can be disabled by setting `numberDecimalcount=-1`.
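+The rounding happens in the SELECT statement itself; a sketch of the query construction (simplified from `histItemFilterQueryProvider` in `JdbcBaseDAO`, found later in this change set):
+
+```java
+public class QueryRoundingDemo {
+
+    // numberDecimalcount > -1 rounds in SQL; -1 disables rounding entirely.
+    static String numberQuery(String table, int numberDecimalcount) {
+        return numberDecimalcount > -1
+                ? "SELECT time, ROUND(value," + numberDecimalcount + ") FROM " + table
+                : "SELECT time, value FROM " + table;
+    }
+
+    public static void main(String[] args) {
+        System.out.println(numberQuery("item0001", 3));  // SELECT time, ROUND(value,3) FROM item0001
+        System.out.println(numberQuery("item0001", -1)); // SELECT time, value FROM item0001
+    }
+}
+```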
+
+### For Developers
+
+* Clearly separated source files for the database-specific part of openHAB logic.
+* Code duplication across similar services is avoided.
+* Integrating a new SQL and JDBC enabled database is fairly simple.
+
+### Performance Tests
+
+These numbers are not necessarily representative of the performance you may experience.
+
+DATABASE | FIRST RUN | AVERAGE | FASTEST | SIZE AFTER | COMMENT
+-------- | --------: | ------: | ------: | ---------: | --------
+Derby | 7.829 | 6.892 | 5.381 | 5.36 MB | local embedded
+H2 | 1.797 | 2.080 | 1.580 | 0.96 MB | local embedded
+HSQLDB | 3.474 | 2.104 | 1.310 | 1.23 MB | local embedded
+MySQL | 11.873 | 11.524 | 10.971 | - | ext. server VM
+PostgreSQL | 8.147 | 7.072 | 6.895 | - | ext. server VM
+SQLite | 2.406 | 1.249 | 1.137 | 0.28 MB | local embedded
+
+* Each test ran about 20 times, once every 30 seconds.
+* openHAB 1.x had already been running for about a minute before the tests started.
+* The timing data (in seconds) used for the evaluation was taken from the console output.
+
+The tests used a rules script like this:
+
+```
+var count = 0;
+rule "DB STRESS TEST"
+when
+ Time cron "30 * * * * ?"
+then
+ if (count == 24) count = 0
+ count = count+1
+ if( count > 3 && count < 23){
+ for( var i=500; i>1; i=i-1){
+ postUpdate( NUMBERITEM, i)
+ SWITCHITEM.previousState().state
+ postUpdate( DIMMERITEM, OFF)
+ NUMBERITEM.changedSince( now().minusMinutes(1))
+ postUpdate( DIMMERITEM, ON)
+ }
+ }
+end
+```
+
+
diff --git a/bundles/org.openhab.persistence.jdbc/pom.xml b/bundles/org.openhab.persistence.jdbc/pom.xml
new file mode 100644
index 0000000000000..479755fd4d4ce
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/pom.xml
@@ -0,0 +1,91 @@
+
+
+
+ 4.0.0
+
+
+ org.openhab.addons.bundles
+ org.openhab.addons.reactor.bundles
+ 3.0.0-SNAPSHOT
+
+
+ org.openhab.persistence.jdbc
+
+ openHAB Add-ons :: Bundles :: Persistence Service :: JDBC
+
+
+ !org.osgi.service.jdbc.*,!sun.security.*,!org.apache.lucene.*,!org.apache.logging.log4j,!waffle.windows.auth.*,!org.hibernate.*,!org.jboss.*,!org.codehaus.groovy.*,!com.codahale.metrics.*,!com.google.protobuf.*,!com.ibm.icu.*,!com.ibm.jvm.*,!com.mchange.*,!com.sun.*,!com.vividsolutions.*,!io.prometheus.*,com.mysql.jdbc;resolution:=optional,org.apache.derby.*;resolution:=optional,org.h2;resolution:=optional,org.h2.jdbcx;resolution:=optional,org.hsqldb;resolution:=optional,org.hsqldb.jdbc;resolution:=optional,org.mariadb.jdbc;resolution:=optional,org.postgresql;resolution:=optional,org.sqlite;resolution:=optional,org.sqlite.jdbc4;resolution:=optional
+
+ UTF-8
+ UTF-8
+ 2.4.7
+ 1.6
+ 3.2.0
+
+
+ 10.12.1.1
+ 1.4.191
+ 2.3.3
+ 1.3.5
+ 8.0.13
+ 9.4.1212
+ 3.16.1
+
+
+
+
+ commons-dbutils
+ commons-dbutils
+ ${dbutils.version}
+
+
+ com.zaxxer
+ HikariCP
+ ${hikari.version}
+
+
+ org.knowm
+ yank
+ ${yank.version}
+
+
+
+
+ org.apache.derby
+ derby
+ ${derby.version}
+
+
+ com.h2database
+ h2
+ ${h2.version}
+
+
+ org.hsqldb
+ hsqldb
+ ${hsqldb.version}
+
+
+ org.mariadb.jdbc
+ mariadb-java-client
+ ${mariadb.version}
+
+
+ mysql
+ mysql-connector-java
+ ${mysql.version}
+
+
+ org.postgresql
+ postgresql
+ ${postgresql.version}
+
+
+ org.xerial
+ sqlite-jdbc
+ ${sqlite.version}
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/feature/feature.xml b/bundles/org.openhab.persistence.jdbc/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..62894575f100a
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/feature/feature.xml
@@ -0,0 +1,55 @@
+
+
+ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:org.apache.derby/derbyclient/${derby.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:com.h2database/h2/${h2.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:org.hsqldb/hsqldb/${hsqldb.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:org.mariadb.jdbc/mariadb-java-client/${mariadb.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:mysql/mysql-connector-java/${mysql.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:org.postgresql/postgresql/${postgresql.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc
+ openhab-runtime-base
+ mvn:org.xerial/sqlite-jdbc/${sqlite.version}
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}
+
+
+
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcBaseDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcBaseDAO.java
new file mode 100644
index 0000000000000..77fa1c664158e
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcBaseDAO.java
@@ -0,0 +1,574 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.Calendar;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcItem;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Default Database Configuration class.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcBaseDAO.class);
+
+ public Properties databaseProps = new Properties();
+ protected String urlSuffix = "";
+ public Map<String, String> sqlTypes = new HashMap<>();
+
+ // Get Database Meta data
+ protected DbMetaData dbMeta;
+
+ protected String SQL_PING_DB;
+ protected String SQL_GET_DB;
+ protected String SQL_IF_TABLE_EXISTS;
+ protected String SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE;
+ protected String SQL_CREATE_ITEMS_TABLE_IF_NOT;
+ protected String SQL_DELETE_ITEMS_ENTRY;
+ protected String SQL_GET_ITEMID_TABLE_NAMES;
+ protected String SQL_GET_ITEM_TABLES;
+ protected String SQL_CREATE_ITEM_TABLE;
+ protected String SQL_INSERT_ITEM_VALUE;
+
+ /********
+ * INIT *
+ ********/
+ public JdbcBaseDAO() {
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ /**
+ * ## Get high precision with fractional seconds, examples ##
+ *
+ * mysql > 5.5 + mariadb > 5.2:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP(3), value TIMESTAMP(3));
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(3), '1999-01-09 20:11:11.126' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+ * mysql <= 5.5 + mariadb <= 5.2: !!! NO high precision and fractional seconds !!!
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(), '1999-01-09 20:11:11.126' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+ * derby:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( CURRENT_TIMESTAMP, '1999-01-09 20:11:11.126' );
+ * SELECT time, value FROM FractionalSeconds;
+ *
+ * H2 + postgreSQL + hsqldb:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(), '1999-01-09 20:11:11.126' );
+ * SELECT time, value FROM FractionalSeconds;
+ *
+ * Sqlite:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( strftime('%Y-%m-%d %H:%M:%f' , 'now' , 'localtime'),
+ * '1999-01-09 20:11:11.124' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+ */
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ SQL_PING_DB = "SELECT 1";
+ SQL_GET_DB = "SELECT DATABASE()";
+ SQL_IF_TABLE_EXISTS = "SHOW TABLES LIKE '#searchTable#'";
+
+ SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE = "INSERT INTO #itemsManageTable# (ItemName) VALUES ('#itemname#')";
+ SQL_CREATE_ITEMS_TABLE_IF_NOT = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (ItemId INT NOT NULL AUTO_INCREMENT,#colname# #coltype# NOT NULL,PRIMARY KEY (ItemId))";
+ SQL_DELETE_ITEMS_ENTRY = "DELETE FROM items WHERE ItemName=#itemname#";
+ SQL_GET_ITEMID_TABLE_NAMES = "SELECT itemid, itemname FROM #itemsManageTable#";
+ SQL_GET_ITEM_TABLES = "SELECT table_name FROM information_schema.tables WHERE table_type='BASE TABLE' AND table_schema='#jdbcUriDatabaseName#' AND NOT table_name='#itemsManageTable#'";
+ SQL_CREATE_ITEM_TABLE = "CREATE TABLE IF NOT EXISTS #tableName# (time #tablePrimaryKey# NOT NULL, value #dbType#, PRIMARY KEY(time))";
+ SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, ? ) ON DUPLICATE KEY UPDATE VALUE= ?";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ sqlTypes.put("CALLITEM", "VARCHAR(200)");
+ sqlTypes.put("COLORITEM", "VARCHAR(70)");
+ sqlTypes.put("CONTACTITEM", "VARCHAR(6)");
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "TINYINT");
+ sqlTypes.put("LOCATIONITEM", "VARCHAR(30)");
+ sqlTypes.put("NUMBERITEM", "DOUBLE");
+ sqlTypes.put("ROLLERSHUTTERITEM", "TINYINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR(65500)");// jdbc max 21845
+ sqlTypes.put("SWITCHITEM", "VARCHAR(6)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP");
+ sqlTypes.put("tablePrimaryValue", "NOW()");
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ *
+ * driverClassName (used with jdbcUrl):
+ * Derby: org.apache.derby.jdbc.EmbeddedDriver
+ * H2: org.h2.Driver
+ * HSQLDB: org.hsqldb.jdbcDriver
+ * Jaybird: org.firebirdsql.jdbc.FBDriver
+ * MariaDB: org.mariadb.jdbc.Driver
+ * MySQL: com.mysql.jdbc.Driver
+ * MaxDB: com.sap.dbtech.jdbc.DriverSapDB
+ * PostgreSQL: org.postgresql.Driver
+ * SyBase: com.sybase.jdbc3.jdbc.SybDriver
+ * SqLite: org.sqlite.JDBC
+ *
+ * dataSourceClassName (for alternative Configuration):
+ * Derby: org.apache.derby.jdbc.ClientDataSource
+ * H2: org.h2.jdbcx.JdbcDataSource
+ * HSQLDB: org.hsqldb.jdbc.JDBCDataSource
+ * Jaybird: org.firebirdsql.pool.FBSimpleDataSource
+ * MariaDB, MySQL: org.mariadb.jdbc.MySQLDataSource
+ * MaxDB: com.sap.dbtech.jdbc.DriverSapDB
+ * PostgreSQL: org.postgresql.ds.PGSimpleDataSource
+ * SyBase: com.sybase.jdbc4.jdbc.SybDataSource
+ * SqLite: org.sqlite.SQLiteDataSource
+ *
+ * HikariPool - configuration Example:
+ * allowPoolSuspension.............false
+ * autoCommit......................true
+ * catalog.........................
+ * connectionInitSql...............
+ * connectionTestQuery.............
+ * connectionTimeout...............30000
+ * dataSource......................
+ * dataSourceClassName.............
+ * dataSourceJNDI..................
+ * dataSourceProperties............{password=}
+ * driverClassName.................
+ * healthCheckProperties...........{}
+ * healthCheckRegistry.............
+ * idleTimeout.....................600000
+ * initializationFailFast..........true
+ * isolateInternalQueries..........false
+ * jdbc4ConnectionTest.............false
+ * jdbcUrl.........................jdbc:mysql://192.168.0.1:3306/test
+ * leakDetectionThreshold..........0
+ * maxLifetime.....................1800000
+ * maximumPoolSize.................10
+ * metricRegistry..................
+ * metricsTrackerFactory...........
+ * minimumIdle.....................10
+ * password........................
+ * poolName........................HikariPool-0
+ * readOnly........................false
+ * registerMbeans..................false
+ * scheduledExecutorService........
+ * threadFactory...................
+ * transactionIsolation............
+ * username........................xxxx
+ * validationTimeout...............5000
+ */
+ private void initDbProps() {
+ // databaseProps.setProperty("dataSource.url", "jdbc:mysql://192.168.0.1:3306/test");
+ // databaseProps.setProperty("dataSource.user", "test");
+ // databaseProps.setProperty("dataSource.password", "test");
+
+ // Most relevant Performance values
+ // maximumPoolSize to 20, minimumIdle to 5, and idleTimeout to 2 minutes.
+ // databaseProps.setProperty("maximumPoolSize", ""+maximumPoolSize);
+ // databaseProps.setProperty("minimumIdle", ""+minimumIdle);
+ // databaseProps.setProperty("idleTimeout", ""+idleTimeout);
+ // databaseProps.setProperty("connectionTimeout",""+connectionTimeout);
+ // databaseProps.setProperty("idleTimeout", ""+idleTimeout);
+ // databaseProps.setProperty("maxLifetime", ""+maxLifetime);
+ // databaseProps.setProperty("validationTimeout",""+validationTimeout);
+
+ }
+
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ // Initialize sqlTypes, depending on DB version for example
+ dbMeta = new DbMetaData();// get DB information
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ public Integer doPingDB() {
+ return Yank.queryScalar(SQL_PING_DB, Integer.class, null);
+ }
+
+ public String doGetDB() {
+ return Yank.queryScalar(SQL_GET_DB, String.class, null);
+ }
+
+ public boolean doIfTableExists(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_IF_TABLE_EXISTS, new String[] { "#searchTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doIfTableExists sql={}", sql);
+ return Yank.queryScalar(sql, String.class, null) != null;
+ }
+
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEMS_TABLE_IF_NOT,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ public void doDeleteItemsEntry(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_DELETE_ITEMS_ENTRY, new String[] { "#itemname#" },
+ new String[] { vo.getItemname() });
+ logger.debug("JDBC::doDeleteItemsEntry sql={}", sql);
+ Yank.execute(sql, null);
+ }
+
+ public List<ItemsVO> doGetItemIDTableNames(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_GET_ITEMID_TABLE_NAMES, new String[] { "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemIDTableNames sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ public List<ItemsVO> doGetItemTables(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_GET_ITEM_TABLES,
+ new String[] { "#jdbcUriDatabaseName#", "#itemsManageTable#" },
+ new String[] { vo.getJdbcUriDatabaseName(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemTables sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ public void doUpdateItemTableNames(List<ItemVO> vol) {
+ String sql = updateItemTableNamesProvider(vol);
+ Yank.execute(sql, null);
+ }
+
+ public void doCreateItemTable(ItemVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEM_TABLE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryKey#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryKey") });
+ logger.debug("JDBC::doCreateItemTable sql={}", sql);
+ Yank.execute(sql, null);
+ }
+
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue(), vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+ String table, String name) {
+ String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+ logger.debug("JDBC::doGetHistItemFilterQuery sql={}", sql);
+ List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
+ List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ items.add(new JdbcItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /*************
+ * Providers *
+ *************/
+ static final DateTimeFormatter jdbcDateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ filter.toString(), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + jdbcDateFormat.format(filter.getBeginDateZoned()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + jdbcDateFormat.format(filter.getEndDateZoned()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC ";
+ if (filter.getPageSize() != 0x7fffffff) {
+ filterString += " LIMIT " + filter.getPageNumber() * filter.getPageSize() + "," + filter.getPageSize();
+ }
+ // SELECT time, ROUND(value,3) FROM number_item_0114 ORDER BY time DESC LIMIT 0,1
+ // rounding HALF UP
+ String queryString = "NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1
+ ? "SELECT time, ROUND(value," + numberDecimalcount + ") FROM " + table
+ : "SELECT time, value FROM " + table;
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ private String updateItemTableNamesProvider(List<ItemVO> namesList) {
+ logger.debug("JDBC::updateItemTableNamesProvider namesList.size = {}", namesList.size());
+ String queryString = "";
+ for (int i = 0; i < namesList.size(); i++) {
+ ItemVO it = namesList.get(i);
+ queryString += "ALTER TABLE " + it.getTableName() + " RENAME TO " + it.getNewTableName() + ";";
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ protected ItemVO storeItemValueProvider(Item item, ItemVO vo) {
+ String itemType = getItemType(item);
+
+ logger.debug("JDBC::storeItemValueProvider: item '{}' as Type '{}' in '{}' with state '{}'", item.getName(),
+ itemType, vo.getTableName(), item.getState().toString());
+
+ // insertItemValue
+ logger.debug("JDBC::storeItemValueProvider: getState: '{}'", item.getState().toString());
+ if ("COLORITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.String.class);
+ vo.setValue(item.getState().toString());
+ } else if ("NUMBERITEM".equals(itemType)) {
+ String it = getSqlTypes().get(itemType);
+ if (it.toUpperCase().contains("DOUBLE")) {
+ vo.setValueTypes(it, java.lang.Double.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.doubleValue: '{}'", newVal.doubleValue());
+ vo.setValue(newVal.doubleValue());
+ } else if (it.toUpperCase().contains("DECIMAL") || it.toUpperCase().contains("NUMERIC")) {
+ vo.setValueTypes(it, java.math.BigDecimal.class);
+ DecimalType newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.toBigDecimal: '{}'", newVal.toBigDecimal());
+ vo.setValue(newVal.toBigDecimal());
+ } else if (it.toUpperCase().contains("INT")) {
+ vo.setValueTypes(it, java.lang.Integer.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.intValue: '{}'", newVal.intValue());
+ vo.setValue(newVal.intValue());
+ } else {// fall back to String
+ vo.setValueTypes(it, java.lang.String.class);
+ logger.warn("JDBC::storeItemValueProvider: item.getState().toString(): '{}'",
+ item.getState().toString());
+ vo.setValue(item.getState().toString());
+ }
+ } else if ("ROLLERSHUTTERITEM".equals(itemType) || "DIMMERITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.Integer.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.intValue: '{}'", newVal.intValue());
+ vo.setValue(newVal.intValue());
+ } else if ("DATETIMEITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.sql.Timestamp.class);
+ Calendar x = ((DateTimeType) item.getState()).getCalendar();
+ java.sql.Timestamp d = new java.sql.Timestamp(x.getTimeInMillis());
+ logger.debug("JDBC::storeItemValueProvider: DateTimeItem: '{}'", d);
+ vo.setValue(d);
+ } else {
+ /*
+ * !!ATTENTION!!
+ *
+ * 1. DimmerItem.getStateAs(PercentType.class).toString() always
+ * returns 0
+ * RollershutterItem.getStateAs(PercentType.class).toString() works
+ * as expected
+ *
+ * 2. (item instanceof ColorItem) == (item instanceof DimmerItem) =
+ * true Therefore for instance tests ColorItem always has to be
+ * tested before DimmerItem
+ *
+ * !!ATTENTION!!
+ */
+ // All other items should return the best format by default
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.String.class);
+ logger.debug("JDBC::storeItemValueProvider: other: item.getState().toString(): '{}'",
+ item.getState().toString());
+ vo.setValue(item.getState().toString());
+ }
+ return vo;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+ protected State getState(Item item, Object v) {
+ String clazz = v.getClass().getSimpleName();
+ logger.debug("JDBC::ItemResultHandler::handleResult getState value = '{}', getClass = '{}', clazz = '{}'",
+ v.toString(), v.getClass(), clazz);
+ if (item instanceof NumberItem) {
+ String it = getSqlTypes().get("NUMBERITEM");
+ if (it.toUpperCase().contains("DOUBLE")) {
+ return new DecimalType(((Number) v).doubleValue());
+ } else if (it.toUpperCase().contains("DECIMAL") || it.toUpperCase().contains("NUMERIC")) {
+ return new DecimalType((BigDecimal) v);
+ } else if (it.toUpperCase().contains("INT")) {
+ return new DecimalType(((Integer) v).intValue());
+ }
+ return DecimalType.valueOf(((String) v).toString());
+
+ } else if (item instanceof ColorItem) {
+ return HSBType.valueOf(((String) v).toString());
+
+ } else if (item instanceof DimmerItem) {
+ return new PercentType(objectAsInteger(v));
+
+ } else if (item instanceof SwitchItem) {
+ return OnOffType.valueOf(((String) v).toString().trim());
+
+ } else if (item instanceof ContactItem) {
+ return OpenClosedType.valueOf(((String) v).toString().trim());
+
+ } else if (item instanceof RollershutterItem) {
+ return new PercentType(objectAsInteger(v));
+
+ } else if (item instanceof DateTimeItem) {
+ Calendar calendar = Calendar.getInstance();
+ calendar.setTimeInMillis(objectAsLong(v));
+ return new DateTimeType(calendar);
+
+ } else if (item instanceof StringItem) {
+ return StringType.valueOf(((String) v).toString());
+
+ } else {// Call, Location, String
+ return StringType.valueOf(((String) v).toString());
+
+ }
+ }
+
+ protected Date objectAsDate(Object v) {
+ if (v instanceof java.lang.String) {
+ // toInstant is Java8 only: return Date.from(Timestamp.valueOf(v.toString()).toInstant());
+ return new Date(Timestamp.valueOf(v.toString()).getTime());
+ }
+ // toInstant is Java8 only: return Date.from(((Timestamp) v).toInstant());
+ return new Date(((Timestamp) v).getTime());
+ }
+
+ protected Long objectAsLong(Object v) {
+ if (v instanceof Long) {
+ return ((Number) v).longValue();
+ } else if (v instanceof java.sql.Date) {
+ return ((java.sql.Date) v).getTime();
+ }
+ return ((java.sql.Timestamp) v).getTime();
+ }
+
+ protected Integer objectAsInteger(Object v) {
+ if (v instanceof Byte) {
+ return ((Byte) v).intValue();
+ }
+ return ((Integer) v).intValue();
+ }
+
+ public String getItemType(Item i) {
+ Item item = i;
+ String def = "STRINGITEM";
+ if (i instanceof GroupItem) {
+ item = ((GroupItem) i).getBaseItem();
+            if (item == null) {
+                // GroupItem has no base item defined in the *.items file; fall back to its first member
+                logger.debug(
+                        "JDBC::getItemType: Cannot detect ItemType for {} because the GroupItem's base type isn't set in the *.items file.",
+                        i.getName());
+                if (((GroupItem) i).getMembers().isEmpty()) {
+                    logger.debug(
+                            "JDBC::getItemType: No Child-Member found for GroupItem {}, using STRINGITEM as fallback",
+                            i.getName());
+                    return def;
+                }
+                item = ((GroupItem) i).getMembers().iterator().next();
+            }
+        }
+ String itemType = item.getClass().getSimpleName().toUpperCase();
+ logger.debug("JDBC::getItemType: Try to use ItemType {} for Item {}", itemType, i.getName());
+ if (sqlTypes.get(itemType) == null) {
+ logger.warn(
+ "JDBC::getItemType: No sqlType found for ItemType {}, use ItemType for STRINGITEM as Fallback for {}",
+ itemType, i.getName());
+ return def;
+ }
+ return itemType;
+ }
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+    public Map<String, String> getSqlTypes() {
+ return sqlTypes;
+ }
+
+ public String getDataType(Item item) {
+ return sqlTypes.get(getItemType(item));
+ }
+
+}
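The STRINGITEM fallback in `getItemType`/`getDataType` above can be sketched standalone. This is a hypothetical, self-contained version (the class name, `resolve` method, and the example type values are illustrative stand-ins; the real code resolves openHAB `Item` instances against the DAO's `sqlTypes` map):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical standalone sketch of the getItemType/getDataType fallback:
// unknown item-type names resolve to the STRINGITEM column type.
public class SqlTypeLookup {
    private final Map<String, String> sqlTypes = new HashMap<>();

    public SqlTypeLookup() {
        sqlTypes.put("NUMBERITEM", "DOUBLE");
        sqlTypes.put("SWITCHITEM", "VARCHAR(6)");
        sqlTypes.put("STRINGITEM", "VARCHAR(65500)");
    }

    // Resolve an item-type name to its SQL column type, falling back to STRINGITEM.
    public String resolve(String itemType) {
        String key = itemType.toUpperCase();
        String type = sqlTypes.get(key);
        return type != null ? type : sqlTypes.get("STRINGITEM");
    }
}
```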
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcDerbyDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcDerbyDAO.java
new file mode 100644
index 0000000000000..ccd595be5642f
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcDerbyDAO.java
@@ -0,0 +1,246 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcItem;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcDerbyDAO extends JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcDerbyDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcDerbyDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ SQL_PING_DB = "values 1";
+ SQL_GET_DB = "VALUES SYSCS_UTIL.SYSCS_GET_DATABASE_PROPERTY( 'DataDictionaryVersion' )"; // returns version
+ SQL_IF_TABLE_EXISTS = "SELECT * FROM SYS.SYSTABLES WHERE TABLENAME='#searchTable#'";
+ SQL_CREATE_ITEMS_TABLE_IF_NOT = "CREATE TABLE #itemsManageTable# ( ItemId INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1), #colname# #coltype# NOT NULL)";
+ SQL_CREATE_ITEM_TABLE = "CREATE TABLE #tableName# (time #tablePrimaryKey# NOT NULL, value #dbType#, PRIMARY KEY(time))";
+        // Prevent errors from duplicate time values (rare): Derby has no convenient MERGE, see
+        // http://www.codeproject.com/Questions/162627/how-to-insert-new-record-in-my-table-if-not-exists
+ SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ private void initSqlTypes() {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "SMALLINT");
+ sqlTypes.put("ROLLERSHUTTERITEM", "SMALLINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR(32000)");
+ sqlTypes.put("tablePrimaryValue", "CURRENT_TIMESTAMP");
+ logger.debug("JDBC::initSqlTypes: Initialized the type array sqlTypes={}", sqlTypes.values());
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+        // Properties for HikariCP
+        // Use driverClassName
+        databaseProps.setProperty("driverClassName", "org.apache.derby.jdbc.EmbeddedDriver");
+        // OR dataSourceClassName
+        // databaseProps.setProperty("dataSourceClassName", "org.apache.derby.jdbc.EmbeddedDataSource");
+        databaseProps.setProperty("maximumPoolSize", "1");
+        databaseProps.setProperty("minimumIdle", "1");
+    }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ // Initialize sqlTypes, depending on DB version for example
+ // derby does not like this... dbMeta = new DbMetaData();// get DB information
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(SQL_PING_DB, Integer.class, null);
+ }
+
+ @Override
+ public boolean doIfTableExists(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_IF_TABLE_EXISTS, new String[] { "#searchTable#" },
+ new String[] { vo.getItemsManageTable().toUpperCase() });
+ logger.debug("JDBC::doIfTableExists sql={}", sql);
+ return Yank.queryScalar(sql, String.class, null) != null;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable().toUpperCase(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ // boolean tableExists = Yank.queryScalar(SQL_IF_TABLE_EXISTS.replace("#searchTable#",
+ // vo.getItemsManageTable().toUpperCase()), String.class, null) == null;
+ boolean tableExists = doIfTableExists(vo);
+ if (!tableExists) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEMS_TABLE_IF_NOT,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable().toUpperCase(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot tableExists={} therefore sql={}", tableExists, sql);
+ Yank.execute(sql, null);
+ } else {
+ logger.debug("JDBC::doCreateItemsTableIfNot tableExists={}, did not CREATE TABLE", tableExists);
+ }
+ return vo;
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doCreateItemTable(ItemVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEM_TABLE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryKey#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryKey") });
+ Yank.execute(sql, null);
+ }
+
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName().toUpperCase(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ @Override
+    public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+            String table, String name) {
+        String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+        List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
+        logger.debug("JDBC::doGetHistItemFilterQuery got Array length={}", m.size());
+
+        List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ logger.debug("JDBC::doGetHistItemFilterQuery 0='{}' 1='{}'", m.get(i)[0], m.get(i)[1]);
+ items.add(new JdbcItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+ static final DateTimeFormatter jdbcDateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+    /**
+     * @param filter the filter criteria to apply
+     * @param numberDecimalcount number of decimal places for rounding numeric values
+     * @param table the table to query
+     * @param simpleName the simple class name of the item type
+     * @return the assembled SQL query string
+     */
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ StringUtilsExt.filterToString(filter), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + jdbcDateFormat.format(filter.getBeginDateZoned()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + jdbcDateFormat.format(filter.getEndDateZoned()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC";
+        if (filter.getPageSize() != 0x7fffffff) {
+            // TODO: TESTING!!! Derby has no LIMIT; pagination uses OFFSET ... ROWS FETCH ... ROWS ONLY, e.g.:
+            // SELECT time, value FROM ohscriptfiles_sw_ace_paths_0001 ORDER BY time DESC OFFSET 1 ROWS FETCH NEXT 0 ROWS ONLY
+            filterString += " OFFSET " + filter.getPageSize() + " ROWS FETCH FIRST "
+                    + (filter.getPageNumber() * filter.getPageSize() + 1) + " ROWS ONLY";
+        }
+
+        // http://www.seemoredata.com/en/showthread.php?132-Round-function-in-Apache-Derby
+        // simulated round function in Derby: CAST(value + 0.0005 AS DECIMAL(15,3))
+        // generally: "CAST(value + 0.0005 AS DECIMAL(15," + numberDecimalcount + "))"
+
+        String queryString = "SELECT time,";
+        if ("NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1) {
+            // rounding HALF UP
+            queryString += "CAST(value + 0.";
+            for (int i = 0; i < numberDecimalcount; i++) {
+                queryString += "0";
+            }
+            queryString += "5 AS DECIMAL(31," + numberDecimalcount + "))"; // 31 is DECIMAL max precision
+            // https://db.apache.org/derby/docs/10.0/manuals/develop/develop151.html
+            queryString += " FROM " + table.toUpperCase();
+        } else {
+            queryString += " value FROM " + table.toUpperCase();
+        }
+
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+
+}
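The HALF-UP rounding trick in `histItemFilterQueryProvider` (add 0.0…05, then truncate by casting to `DECIMAL`) can be isolated as a sketch. `DerbyRound` and `roundExpr` are hypothetical names for illustration, not part of the binding:

```java
// Hypothetical helper isolating the Derby HALF-UP rounding expression
// built in histItemFilterQueryProvider: add 0.0...05, then truncate by
// casting to DECIMAL(31, n) (31 is Derby's maximum DECIMAL precision).
public class DerbyRound {
    public static String roundExpr(int numberDecimalcount) {
        StringBuilder sb = new StringBuilder("CAST(value + 0.");
        for (int i = 0; i < numberDecimalcount; i++) {
            sb.append('0');
        }
        sb.append("5 AS DECIMAL(31,").append(numberDecimalcount).append("))");
        return sb.toString();
    }
}
```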
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcH2DAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcH2DAO.java
new file mode 100644
index 0000000000000..6fcbb971f2ed3
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcH2DAO.java
@@ -0,0 +1,96 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcH2DAO extends JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcH2DAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcH2DAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ SQL_IF_TABLE_EXISTS = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME='#searchTable#'";
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) )";
+ // http://stackoverflow.com/questions/19768051/h2-sql-database-insert-if-the-record-does-not-exist
+ SQL_INSERT_ITEM_VALUE = "MERGE INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+    private void initSqlTypes() {
+        // no overrides: the default types from JdbcBaseDAO are used
+    }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+        // Properties for HikariCP
+        databaseProps.setProperty("driverClassName", "org.h2.Driver");
+        // driverClassName OR BETTER USE dataSourceClassName
+        // databaseProps.setProperty("dataSourceClassName", "org.h2.jdbcx.JdbcDataSource");
+    }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
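The `#placeholder#` SQL templates above are filled in by `StringUtilsExt.replaceArrayMerge` before execution. A minimal sketch of that substitution, assuming simple replace-all semantics per key (which is also why a key listed twice in some call sites still works):

```java
// Minimal sketch of #placeholder# substitution as used for the SQL
// templates above (assumes replace-all semantics for each key).
public class PlaceholderSql {
    public static String replaceArrayMerge(String sql, String[] keys, String[] values) {
        for (int i = 0; i < keys.length; i++) {
            sql = sql.replace(keys[i], values[i]);
        }
        return sql;
    }
}
```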
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcHsqldbDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcHsqldbDAO.java
new file mode 100644
index 0000000000000..133fff53e1773
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcHsqldbDAO.java
@@ -0,0 +1,127 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcHsqldbDAO extends JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcHsqldbDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcHsqldbDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ // http://hsqldb.org/doc/guide/builtinfunctions-chapt.html
+ SQL_PING_DB = "SELECT 1 FROM INFORMATION_SCHEMA.SYSTEM_USERS";
+ SQL_GET_DB = "SELECT DATABASE () FROM INFORMATION_SCHEMA.SYSTEM_USERS";
+ SQL_IF_TABLE_EXISTS = "SELECT * FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE TABLE_NAME='#searchTable#'";
+ SQL_CREATE_ITEMS_TABLE_IF_NOT = "CREATE TABLE IF NOT EXISTS #itemsManageTable# ( ItemId INT GENERATED BY DEFAULT AS IDENTITY (START WITH 1, INCREMENT BY 1) NOT NULL, #colname# #coltype# NOT NULL)";
+ SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE = "INSERT INTO #itemsManageTable# (ItemName) VALUES ('#itemname#')";
+ // Prevent error against duplicate time value
+ // http://hsqldb.org/doc/guide/dataaccess-chapt.html#dac_merge_statement
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) )";
+ SQL_INSERT_ITEM_VALUE = "MERGE INTO #tableName# "
+ + "USING (VALUES #tablePrimaryValue#, CAST( ? as #dbType#)) temp (TIME, VALUE) ON (#tableName#.TIME=temp.TIME) "
+ + "WHEN NOT MATCHED THEN INSERT (TIME, VALUE) VALUES (temp.TIME, temp.VALUE)";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+    private void initSqlTypes() {
+        // no overrides: the default types from JdbcBaseDAO are used
+    }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+        // Properties for HikariCP
+        databaseProps.setProperty("driverClassName", "org.hsqldb.jdbcDriver");
+    }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(SQL_PING_DB, Integer.class, null);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEMS_TABLE_IF_NOT,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#", "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#dbType#", "#tableName#", "#tablePrimaryValue#" }, new String[] {
+ vo.getTableName(), vo.getDbType(), vo.getTableName(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+
+}
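The HSQLDB `MERGE ... WHEN NOT MATCHED THEN INSERT` statement above protects against duplicate TIME keys. A toy in-memory model of just that WHEN NOT MATCHED branch (not HSQLDB semantics verbatim; class and method names are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy in-memory model of the duplicate-time protection provided by
// MERGE ... WHEN NOT MATCHED THEN INSERT: a second write with the same
// TIME key is silently dropped instead of raising a key violation.
public class MergeModel {
    private final Map<String, Double> rows = new LinkedHashMap<>();

    public void mergeInsert(String time, double value) {
        if (!rows.containsKey(time)) { // WHEN NOT MATCHED
            rows.put(time, value);
        }
    }

    public Double get(String time) {
        return rows.get(time);
    }
}
```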
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMariadbDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMariadbDAO.java
new file mode 100644
index 0000000000000..e9754853fad55
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMariadbDAO.java
@@ -0,0 +1,108 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMariadbDAO extends JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcMariadbDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcMariadbDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+    private void initSqlTypes() {
+        logger.debug("JDBC::initSqlTypes: Initialize the type array");
+        // no static overrides; version-dependent types are set in initAfterFirstDbConnection
+    }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+        // Performance tuning
+        databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+        databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+        databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+        // disable JDBC-compliant truncation (the JDBC standard max VARCHAR length is 21845)
+        databaseProps.setProperty("dataSource.jdbcCompliantTruncation", "false");
+
+        // Properties for HikariCP
+        // Use driverClassName
+        databaseProps.setProperty("driverClassName", "org.mariadb.jdbc.Driver");
+        // OR dataSourceClassName
+        // databaseProps.setProperty("dataSourceClassName", "org.mariadb.jdbc.MySQLDataSource");
+        databaseProps.setProperty("maximumPoolSize", "3");
+        databaseProps.setProperty("minimumIdle", "2");
+    }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ dbMeta = new DbMetaData();
+ // Initialize sqlTypes, depending on DB version for example
+ if (dbMeta.isDbVersionGreater(5, 1)) {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryValue", "NOW(3)");
+ }
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(SQL_PING_DB, Long.class, null).intValue();
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+
+}
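The version-gated type overrides in `initAfterFirstDbConnection` rely on a greater-than check like `DbMetaData.isDbVersionGreater(5, 1)`. A hypothetical sketch of that major/minor comparison (class and parameter names are illustrative):

```java
// Hypothetical sketch of a major/minor version check in the spirit of
// DbMetaData.isDbVersionGreater: e.g. MariaDB newer than 5.1 gets the
// millisecond-precision TIMESTAMP(3)/NOW(3) overrides above.
public class VersionGate {
    public static boolean isDbVersionGreater(int major, int minor, int wantMajor, int wantMinor) {
        return major > wantMajor || (major == wantMajor && minor > wantMinor);
    }
}
```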
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMysqlDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMysqlDAO.java
new file mode 100644
index 0000000000000..fd256e49cc050
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcMysqlDAO.java
@@ -0,0 +1,113 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * Since driver version >= 6.0 a timezone conversion is sometimes needed: ?serverTimezone=UTC
+ * Example: dbProps.setProperty("jdbcUrl", "jdbc:mysql://192.168.0.181:3306/ItemTypeTest3?serverTimezone=UTC"); // MySQL 5.7
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMysqlDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcMysqlDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcMysqlDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+        sqlTypes.put("STRINGITEM", "VARCHAR(21717)"); // MySQL utf-8 uses up to 3 bytes/char: 65535/3 = 21845, minus 128 headroom = 21717
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+        // Performance tuning
+        databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+        databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+        databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+        // disable JDBC-compliant truncation (the JDBC standard max VARCHAR length is 21845)
+        databaseProps.setProperty("dataSource.jdbcCompliantTruncation", "false");
+
+        // Properties for HikariCP
+        // Use driverClassName
+        databaseProps.setProperty("driverClassName", "com.mysql.jdbc.Driver");
+        // OR dataSourceClassName
+        // databaseProps.setProperty("dataSourceClassName", "com.mysql.jdbc.jdbc2.optional.MysqlDataSource");
+        databaseProps.setProperty("maximumPoolSize", "3");
+        databaseProps.setProperty("minimumIdle", "2");
+    }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ dbMeta = new DbMetaData();
+ // Initialize sqlTypes, depending on DB version for example
+ if (dbMeta.isDbVersionGreater(5, 5)) {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryValue", "NOW(3)");
+ }
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(SQL_PING_DB, Long.class, null).intValue();
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+
+}
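The `VARCHAR(21717)` width chosen in `initSqlTypes` follows from MySQL's 65535-byte row limit with 3-byte utf-8 characters. The worked arithmetic, as a sketch (class and method names are hypothetical; the 128-char headroom mirrors the comment in `initSqlTypes`):

```java
// Worked arithmetic behind VARCHAR(21717) in initSqlTypes above:
// MySQL's 65535-byte limit divided by 3 bytes per utf-8 character,
// minus 128 characters kept as headroom (per the original comment).
public class MysqlVarcharWidth {
    public static int stringItemWidth() {
        int maxChars = 65535 / 3; // 21845
        return maxChars - 128;    // 21717
    }
}
```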
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcPostgresqlDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcPostgresqlDAO.java
new file mode 100644
index 0000000000000..b7119603cfbaa
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcPostgresqlDAO.java
@@ -0,0 +1,211 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcItem;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcPostgresqlDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcPostgresqlDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcPostgresqlDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ // System Information Functions: https://www.postgresql.org/docs/9.2/static/functions-info.html
+ SQL_GET_DB = "SELECT CURRENT_DATABASE()";
+ SQL_IF_TABLE_EXISTS = "SELECT * FROM PG_TABLES WHERE TABLENAME='#searchTable#'";
+ SQL_CREATE_ITEMS_TABLE_IF_NOT = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (itemid SERIAL NOT NULL, #colname# #coltype# NOT NULL, CONSTRAINT #itemsManageTable#_pkey PRIMARY KEY (itemid))";
+ SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE = "INSERT INTO items (itemname) SELECT itemname FROM #itemsManageTable# UNION VALUES ('#itemname#') EXCEPT SELECT itemname FROM items";
+ SQL_GET_ITEM_TABLES = "SELECT table_name FROM information_schema.tables WHERE table_type='BASE TABLE' AND table_schema='public' AND NOT table_name='#itemsManageTable#'";
+ // http://stackoverflow.com/questions/17267417/how-do-i-do-an-upsert-merge-insert-on-duplicate-update-in-postgresql
+ // for later use, PostgreSql > 9.5 to prevent PRIMARY key violation use:
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) ) ON
+ // CONFLICT DO NOTHING";
+ SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ // Initialize the type array
+ sqlTypes.put("CALLITEM", "VARCHAR");
+ sqlTypes.put("COLORITEM", "VARCHAR");
+ sqlTypes.put("CONTACTITEM", "VARCHAR");
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "SMALLINT");
+ sqlTypes.put("LOCATIONITEM", "VARCHAR");
+ sqlTypes.put("NUMBERITEM", "DOUBLE PRECISION");
+ sqlTypes.put("ROLLERSHUTTERITEM", "SMALLINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR");
+ sqlTypes.put("SWITCHITEM", "VARCHAR");
+ logger.debug("JDBC::initSqlTypes: Initialized the type array sqlTypes={}", sqlTypes.values());
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+    private void initDbProps() {
+ // Performance:
+ // databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+ // databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+ // databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.postgresql.Driver");
+ // driverClassName OR BETTER USE dataSourceClassName
+ // databaseProps.setProperty("dataSourceClassName", "org.postgresql.ds.PGSimpleDataSource");
+ // databaseProps.setProperty("maximumPoolSize", "3");
+ // databaseProps.setProperty("minimumIdle", "2");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEMS_TABLE_IF_NOT,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#", "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_NEW_ENTRY_IN_ITEMS_TABLE,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ @Override
+ public List<ItemsVO> doGetItemTables(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_GET_ITEM_TABLES, new String[] { "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemTables sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ @Override
+ public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+ String table, String name) {
+ String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+ logger.debug("JDBC::doGetHistItemFilterQuery sql={}", sql);
+ List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
+ List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ items.add(new JdbcItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+ static final DateTimeFormatter jdbcDateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+ /**
+ * @param filter the filter criteria to apply
+ * @param numberDecimalcount the number of decimal places used to round NUMBERITEM values
+ * @param table the name of the table to query
+ * @param simpleName the simple class name of the item type
+ * @return the generated SQL query string
+ */
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ filter.toString(), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + jdbcDateFormat.format(filter.getBeginDateZoned()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + jdbcDateFormat.format(filter.getEndDateZoned()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC";
+ if (filter.getPageSize() != 0x7fffffff) {
+ // see:
+ // http://www.jooq.org/doc/3.5/manual/sql-building/sql-statements/select-statement/limit-clause/
+ filterString += " OFFSET " + filter.getPageNumber() * filter.getPageSize() + " LIMIT "
+ + filter.getPageSize();
+ }
+ String queryString = "NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1
+ ? "SELECT time, ROUND(CAST (value AS numeric)," + numberDecimalcount + ") FROM " + table
+ : "SELECT time, value FROM " + table;
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+
+}
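
The DAO classes in this PR build their SQL statements by substituting `#token#` placeholders through `StringUtilsExt.replaceArrayMerge`. As a minimal sketch of that substitution idea (the helper below is a hypothetical stand-in, not the actual implementation from `org.openhab.persistence.jdbc.utils`):

```java
public class ReplaceArrayMergeSketch {
    // Hypothetical stand-in for StringUtilsExt.replaceArrayMerge:
    // replaces each search token with the replacement at the same index.
    static String replaceArrayMerge(String str, String[] search, String[] replace) {
        for (int i = 0; i < search.length; i++) {
            str = str.replace(search[i], replace[i]);
        }
        return str;
    }

    public static void main(String[] args) {
        String sql = replaceArrayMerge(
                "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) )",
                new String[] { "#tableName#", "#dbType#" },
                new String[] { "item0001", "DOUBLE PRECISION" });
        System.out.println(sql);
    }
}
```

Only the placeholders are substituted this way; the item value itself is passed as a JDBC `?` parameter, so it never goes through string replacement.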
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcSqliteDAO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcSqliteDAO.java
new file mode 100644
index 0000000000000..bda86ea2bc6ec
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/db/JdbcSqliteDAO.java
@@ -0,0 +1,115 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class representing the database-specific
+ * configuration. It overrides and supplements the default settings from
+ * JdbcBaseDAO; only the differences to JdbcBaseDAO are declared here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcSqliteDAO extends JdbcBaseDAO {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcSqliteDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcSqliteDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ SQL_GET_DB = "PRAGMA DATABASE_LIST"; // "SELECT SQLITE_VERSION()"; // "PRAGMA DATABASE_LIST"->db Path/Name
+ // "PRAGMA SCHEMA_VERSION";
+ SQL_IF_TABLE_EXISTS = "SELECT name FROM sqlite_master WHERE type='table' AND name='#searchTable#'";
+ SQL_CREATE_ITEMS_TABLE_IF_NOT = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (ItemId INTEGER PRIMARY KEY AUTOINCREMENT, #colname# #coltype# NOT NULL)";
+ SQL_INSERT_ITEM_VALUE = "INSERT OR IGNORE INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ sqlTypes.put("tablePrimaryValue", "strftime('%Y-%m-%d %H:%M:%f' , 'now' , 'localtime')");
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.sqlite.JDBC");
+ // driverClassName OR BETTER USE dataSourceClassName
+ // databaseProps.setProperty("dataSourceClassName", "org.sqlite.SQLiteDataSource");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+
+ @Override
+ public String doGetDB() {
+ return Yank.queryColumn(SQL_GET_DB, "file", String.class, null).get(0);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_CREATE_ITEMS_TABLE_IF_NOT,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(SQL_INSERT_ITEM_VALUE,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
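
The `histItemFilterQueryProvider` method above pages query results with `OFFSET pageNumber * pageSize LIMIT pageSize`, and skips the paging clause entirely when the page size is `0x7fffffff` (`Integer.MAX_VALUE`, the "no limit" sentinel). A sketch of just that arithmetic:

```java
public class LimitClauseSketch {
    // Builds the paging suffix used by histItemFilterQueryProvider:
    // OFFSET skips pageNumber full pages, LIMIT caps the page size.
    static String pagingClause(int pageNumber, int pageSize) {
        if (pageSize == Integer.MAX_VALUE) {
            return ""; // unlimited page size: no paging clause
        }
        return " OFFSET " + (pageNumber * pageSize) + " LIMIT " + pageSize;
    }

    public static void main(String[] args) {
        // third page of 50 rows: skip 100, return at most 50
        System.out.println(pagingClause(2, 50));
    }
}
```

Note that `OFFSET ... LIMIT ...` ordering is PostgreSQL/SQLite syntax; other dialects handled by sibling DAOs use `LIMIT n OFFSET m` or `TOP`/`FETCH FIRST` variants.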
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcConfiguration.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcConfiguration.java
new file mode 100644
index 0000000000000..eefe0de2dd052
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcConfiguration.java
@@ -0,0 +1,385 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.Collections;
+import java.util.Enumeration;
+import java.util.Map;
+import java.util.Properties;
+import java.util.Set;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+import org.apache.commons.lang.StringUtils;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.persistence.jdbc.db.JdbcBaseDAO;
+import org.openhab.persistence.jdbc.utils.MovingAverage;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Configuration class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcConfiguration {
+ private static final Logger logger = LoggerFactory.getLogger(JdbcConfiguration.class);
+
+ private static final Pattern EXTRACT_CONFIG_PATTERN = Pattern.compile("^(.*?)\\.([0-9.a-zA-Z]+)$");
+ private static final String DB_DAO_PACKAGE = "org.openhab.persistence.jdbc.db.Jdbc";
+
+ private Map<Object, Object> configuration;
+
+ private JdbcBaseDAO dBDAO = null;
+ private String dbName = null;
+ boolean dbConnected = false;
+ boolean driverAvailable = false;
+
+ private String serviceName;
+ private String name = "jdbc";
+ public final boolean valid;
+
+ // private String url;
+ // private String user;
+ // private String password;
+ private int numberDecimalcount = 3;
+ private boolean tableUseRealItemNames = false;
+ private String tableNamePrefix = "item";
+ private int tableIdDigitCount = 4;
+ private boolean rebuildTableNames = false;
+
+ private int errReconnectThreshold = 0;
+
+ public int timerCount = 0;
+ public int time1000Statements = 0;
+ public long timer1000 = 0;
+ public MovingAverage timeAverage50arr = new MovingAverage(50);
+ public MovingAverage timeAverage100arr = new MovingAverage(100);
+ public MovingAverage timeAverage200arr = new MovingAverage(200);
+ public boolean enableLogTime = false;
+
+ public JdbcConfiguration(Map<Object, Object> configuration) {
+ logger.debug("JDBC::JdbcConfiguration");
+ valid = updateConfig(configuration);
+ }
+
+ private boolean updateConfig(Map<Object, Object> config) {
+ configuration = config;
+
+ logger.debug("JDBC::updateConfig: configuration size = {}", configuration.size());
+
+ String user = (String) configuration.get("user");
+ String password = (String) configuration.get("password");
+
+ // mandatory url
+ String url = (String) configuration.get("url");
+
+ if (url == null) {
+ logger.error("Mandatory url parameter is missing in configuration!");
+ return false;
+ }
+
+ Properties parsedURL = StringUtilsExt.parseJdbcURL(url);
+
+ if (StringUtils.isBlank(user)) {
+ logger.debug("No jdbc:user parameter defined in jdbc.cfg");
+ }
+ if (StringUtils.isBlank(password)) {
+ logger.debug("No jdbc:password parameter defined in jdbc.cfg.");
+ }
+
+ if (StringUtils.isBlank(url)) {
+ logger.debug(
+ "JDBC url is missing - please configure in jdbc.cfg like 'jdbc:<service>:<host>[:<port>;<attributes>]'");
+ return false;
+ }
+
+ if ("false".equalsIgnoreCase(parsedURL.getProperty("parseValid"))) {
+ Enumeration<?> en = parsedURL.propertyNames();
+ String enstr = "";
+ for (Object key : Collections.list(en)) {
+ enstr += key + " = " + parsedURL.getProperty("" + key) + "\n";
+ }
+ logger.warn(
+ "JDBC url is not well formatted: {}\nPlease configure in openhab.cfg like 'jdbc:<service>:<host>[:<port>;<attributes>]'",
+ enstr);
+ return false;
+ }
+
+ logger.debug("JDBC::updateConfig: user={}", user);
+ logger.debug("JDBC::updateConfig: password exists? {}", password != null && !StringUtils.isBlank(password));
+ logger.debug("JDBC::updateConfig: url={}", url);
+
+ // set database type and database type class
+ setDBDAOClass(parsedURL.getProperty("dbShortcut")); // derby, h2, hsqldb, mariadb, mysql, postgresql,
+ // sqlite
+ // set user
+ if (StringUtils.isNotBlank(user)) {
+ dBDAO.databaseProps.setProperty("dataSource.user", user);
+ }
+
+ // set password
+ if (StringUtils.isNotBlank(password)) {
+ dBDAO.databaseProps.setProperty("dataSource.password", password);
+ }
+
+ // set sql-types from external config
+ setSqlTypes();
+
+ String et = (String) configuration.get("reconnectCnt");
+ if (StringUtils.isNotBlank(et) && StringUtils.isNumeric(et)) {
+ errReconnectThreshold = Integer.parseInt(et);
+ logger.debug("JDBC::updateConfig: errReconnectThreshold={}", errReconnectThreshold);
+ }
+
+ String np = (String) configuration.get("tableNamePrefix");
+ if (StringUtils.isNotBlank(np)) {
+ tableNamePrefix = np;
+ logger.debug("JDBC::updateConfig: tableNamePrefix={}", tableNamePrefix);
+ }
+
+ String dd = (String) configuration.get("numberDecimalcount");
+ if (StringUtils.isNotBlank(dd) && StringUtils.isNumeric(dd)) {
+ numberDecimalcount = Integer.parseInt(dd);
+ logger.debug("JDBC::updateConfig: numberDecimalcount={}", numberDecimalcount);
+ }
+
+ String rn = (String) configuration.get("tableUseRealItemNames");
+ if (StringUtils.isNotBlank(rn)) {
+ tableUseRealItemNames = "true".equals(rn) ? Boolean.parseBoolean(rn) : false;
+ logger.debug("JDBC::updateConfig: tableUseRealItemNames={}", tableUseRealItemNames);
+ }
+
+ String td = (String) configuration.get("tableIdDigitCount");
+ if (StringUtils.isNotBlank(td) && StringUtils.isNumeric(td)) {
+ tableIdDigitCount = Integer.parseInt(td);
+ logger.debug("JDBC::updateConfig: tableIdDigitCount={}", tableIdDigitCount);
+ }
+
+ String rt = (String) configuration.get("rebuildTableNames");
+ if (StringUtils.isNotBlank(rt)) {
+ rebuildTableNames = Boolean.parseBoolean(rt);
+ logger.debug("JDBC::updateConfig: rebuildTableNames={}", rebuildTableNames);
+ }
+
+ // undocumented
+ String ac = (String) configuration.get("maximumPoolSize");
+ if (StringUtils.isNotBlank(ac)) {
+ dBDAO.databaseProps.setProperty("maximumPoolSize", ac);
+ }
+
+ // undocumented
+ String ic = (String) configuration.get("minimumIdle");
+ if (StringUtils.isNotBlank(ic)) {
+ dBDAO.databaseProps.setProperty("minimumIdle", ic);
+ }
+
+ // undocumented
+ String it = (String) configuration.get("idleTimeout");
+ if (StringUtils.isNotBlank(it)) {
+ dBDAO.databaseProps.setProperty("idleTimeout", it);
+ }
+ // undocumented
+ String ent = (String) configuration.get("enableLogTime");
+ if (StringUtils.isNotBlank(ent)) {
+ enableLogTime = "true".equals(ent) ? Boolean.parseBoolean(ent) : false;
+ }
+ logger.debug("JDBC::updateConfig: enableLogTime {}", enableLogTime);
+
+ // undocumented
+ String fd = (String) configuration.get("driverClassName");
+ if (StringUtils.isNotBlank(fd)) {
+ dBDAO.databaseProps.setProperty("driverClassName", fd);
+ }
+
+ // undocumented
+ String ds = (String) configuration.get("dataSourceClassName");
+ if (StringUtils.isNotBlank(ds)) {
+ dBDAO.databaseProps.setProperty("dataSourceClassName", ds);
+ }
+
+ // undocumented
+ String dn = dBDAO.databaseProps.getProperty("driverClassName");
+ if (dn == null) {
+ dn = dBDAO.databaseProps.getProperty("dataSourceClassName");
+ } else {
+ dBDAO.databaseProps.setProperty("jdbcUrl", url);
+ }
+
+ // test if JDBC driver bundle is available
+ testJDBCDriver(dn);
+
+ logger.debug("JDBC::updateConfig: configuration complete. service={}", getName());
+
+ return true;
+ }
+
+ private void setDBDAOClass(String sn) {
+
+ serviceName = "none";
+
+ // set database type
+ if (StringUtils.isBlank(sn) || sn.length() < 2) {
+ logger.error(
+ "JDBC::updateConfig: Required database url like 'jdbc:<service>:<host>[:<port>;<attributes>]' - please configure the jdbc:url parameter in openhab.cfg");
+ } else {
+ serviceName = sn;
+ }
+ logger.debug("JDBC::updateConfig: found serviceName = '{}'", serviceName);
+
+ // set class for database type
+ String ddp = DB_DAO_PACKAGE + serviceName.toUpperCase().charAt(0) + serviceName.toLowerCase().substring(1)
+ + "DAO";
+
+ logger.debug("JDBC::updateConfig: Init Data Access Object Class: '{}'", ddp);
+ try {
+ dBDAO = (JdbcBaseDAO) Class.forName(ddp).newInstance();
+ logger.debug("JDBC::updateConfig: dBDAO ClassName={}", dBDAO.getClass().getName());
+ } catch (InstantiationException e) {
+ logger.error("JDBC::updateConfig: InstantiationException: {}", e.getMessage());
+ } catch (IllegalAccessException e) {
+ logger.error("JDBC::updateConfig: IllegalAccessException: {}", e.getMessage());
+ } catch (ClassNotFoundException e) {
+ logger.warn("JDBC::updateConfig: no Configuration for serviceName '{}' found. ClassNotFoundException: {}",
+ serviceName, e.getMessage());
+ logger.debug("JDBC::updateConfig: using default Database Configuration: JdbcBaseDAO !!");
+ dBDAO = new JdbcBaseDAO();
+ logger.debug("JDBC::updateConfig: dBConfig done");
+ }
+ }
+
+ private void setSqlTypes() {
+ Set<Object> keys = configuration.keySet();
+
+ for (Object k : keys) {
+ String key = (String) k;
+ Matcher matcher = EXTRACT_CONFIG_PATTERN.matcher(key);
+ if (!matcher.matches()) {
+ continue;
+ }
+ matcher.reset();
+ matcher.find();
+ if (!matcher.group(1).equals("sqltype")) {
+ continue;
+ }
+ String itemType = matcher.group(2);
+ if (!itemType.startsWith("table")) {
+ itemType = itemType.toUpperCase() + "ITEM";
+ }
+ String value = (String) configuration.get(key);
+ logger.debug("JDBC::updateConfig: set sqlTypes: itemType={} value={}", itemType, value);
+ dBDAO.sqlTypes.put(itemType, value);
+ }
+ }
+
+ private void testJDBCDriver(String driver) {
+ driverAvailable = true;
+ try {
+ Class.forName(driver);
+ logger.debug("JDBC::updateConfig: load JDBC-driverClass was successful: '{}'", driver);
+ } catch (ClassNotFoundException e) {
+ driverAvailable = false;
+ logger.error(
+ "JDBC::updateConfig: could NOT load JDBC-driverClassName or JDBC-dataSourceClassName. ClassNotFoundException: '{}'",
+ e.getMessage());
+ String warn = ""
+ + "\n\n\t!!!\n\tTo avoid this error, place an appropriate JDBC driver file for serviceName '{}' in the addons directory.\n"
+ + "\tCopy the missing JDBC driver jar to your openHAB addons folder.\n\t!!!\n" + "\tDOWNLOAD: \n";
+ if (serviceName.equals("derby")) {
+ warn += "\tDerby: version >= 10.11.1.1 from http://mvnrepository.com/artifact/org.apache.derby/derby\n";
+ } else if (serviceName.equals("h2")) {
+ warn += "\tH2: version >= 1.4.189 from http://mvnrepository.com/artifact/com.h2database/h2\n";
+ } else if (serviceName.equals("hsqldb")) {
+ warn += "\tHSQLDB: version >= 2.3.3 from http://mvnrepository.com/artifact/org.hsqldb/hsqldb\n";
+ } else if (serviceName.equals("mariadb")) {
+ warn += "\tMariaDB: version >= 1.2.0 from http://mvnrepository.com/artifact/org.mariadb.jdbc/mariadb-java-client\n";
+ } else if (serviceName.equals("mysql")) {
+ warn += "\tMySQL: version >= 5.1.36 from http://mvnrepository.com/artifact/mysql/mysql-connector-java\n";
+ } else if (serviceName.equals("postgresql")) {
+ warn += "\tPostgreSQL:version >= 9.4.1208 from http://mvnrepository.com/artifact/org.postgresql/postgresql\n";
+ } else if (serviceName.equals("sqlite")) {
+ warn += "\tSQLite: version >= 3.16.1 from http://mvnrepository.com/artifact/org.xerial/sqlite-jdbc\n";
+ }
+ logger.warn(warn, serviceName);
+ }
+ }
+
+ public Properties getHikariConfiguration() {
+ return dBDAO.databaseProps;
+ }
+
+ public String getName() {
+ // return serviceName;
+ return name;
+ }
+
+ public String getServiceName() {
+ return serviceName;
+ }
+
+ public String getTableNamePrefix() {
+ return tableNamePrefix;
+ }
+
+ public int getErrReconnectThreshold() {
+ return errReconnectThreshold;
+ }
+
+ public boolean getRebuildTableNames() {
+ return rebuildTableNames;
+ }
+
+ public int getNumberDecimalcount() {
+ return numberDecimalcount;
+ }
+
+ public boolean getTableUseRealItemNames() {
+ return tableUseRealItemNames;
+ }
+
+ public int getTableIdDigitCount() {
+ return tableIdDigitCount;
+ }
+
+ public JdbcBaseDAO getDBDAO() {
+ return dBDAO;
+ }
+
+ public String getDbName() {
+ return dbName;
+ }
+
+ public void setDbName(String dbName) {
+ this.dbName = dbName;
+ }
+
+ public boolean isDbConnected() {
+ return dbConnected;
+ }
+
+ public void setDbConnected(boolean dbConnected) {
+ logger.debug("JDBC::setDbConnected {}", dbConnected);
+ // Initializing step, after db is connected.
+ // Initialize sqlTypes, depending on DB version for example
+ dBDAO.initAfterFirstDbConnection();
+ // Run setSqlTypes() once again so externally configured SqlTypes take priority.
+ setSqlTypes();
+ this.dbConnected = dbConnected;
+ }
+
+ public boolean isDriverAvailable() {
+ return driverAvailable;
+ }
+
+}
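
`setDBDAOClass` resolves the DAO implementation purely by naming convention: the `dbShortcut` parsed from the JDBC url is capitalized and wrapped between the `DB_DAO_PACKAGE` prefix and a `DAO` suffix, then instantiated via reflection. The class-name construction can be sketched as:

```java
public class DaoClassNameSketch {
    private static final String DB_DAO_PACKAGE = "org.openhab.persistence.jdbc.db.Jdbc";

    // Mirrors the class-name construction in JdbcConfiguration.setDBDAOClass.
    static String daoClassName(String serviceName) {
        return DB_DAO_PACKAGE + serviceName.toUpperCase().charAt(0)
                + serviceName.toLowerCase().substring(1) + "DAO";
    }

    public static void main(String[] args) {
        // "postgresql" -> org.openhab.persistence.jdbc.db.JdbcPostgresqlDAO
        System.out.println(daoClassName("postgresql"));
    }
}
```

This is why a `jdbc:postgresql:...` url picks up `JdbcPostgresqlDAO`, and why an unknown shortcut falls back to `JdbcBaseDAO` via the `ClassNotFoundException` branch.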
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcMapper.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcMapper.java
new file mode 100644
index 0000000000000..3480d097c56bc
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcMapper.java
@@ -0,0 +1,406 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Mapper class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMapper {
+ static final Logger logger = LoggerFactory.getLogger(JdbcMapper.class);
+
+ // Error counter - used to reconnect to database on error
+ protected int errCnt;
+ protected boolean initialized = false;
+ protected JdbcConfiguration conf = null;
+ protected Map<String, String> sqlTables = new HashMap<>();
+ private long afterAccessMin = 10000;
+ private long afterAccessMax = 0;
+ private static final String ITEM_NAME_PATTERN = "[^a-zA-Z_0-9\\-]";
+
+ /*****************
+ * MAPPER ITEMS *
+ *****************/
+ public boolean pingDB() {
+ logger.debug("JDBC::pingDB");
+ boolean ret = false;
+ long timerStart = System.currentTimeMillis();
+ if (openConnection()) {
+ if (conf.getDbName() == null) {
+ logger.debug(
+ "JDBC::pingDB asking db for name as the first db action after the connection is established.");
+ String dbName = conf.getDBDAO().doGetDB();
+ conf.setDbName(dbName);
+ ret = dbName.length() > 0;
+ } else {
+ ret = conf.getDBDAO().doPingDB() > 0;
+ }
+ }
+ logTime("pingDB", timerStart, System.currentTimeMillis());
+ return ret;
+ }
+
+ public String getDB() {
+ logger.debug("JDBC::getDB");
+ long timerStart = System.currentTimeMillis();
+ String res = conf.getDBDAO().doGetDB();
+ logTime("getDB", timerStart, System.currentTimeMillis());
+ return res;
+ }
+
+ public ItemsVO createNewEntryInItemsTable(ItemsVO vo) {
+ logger.debug("JDBC::createNewEntryInItemsTable");
+ long timerStart = System.currentTimeMillis();
+ Long i = conf.getDBDAO().doCreateNewEntryInItemsTable(vo);
+ vo.setItemid(i.intValue());
+ logTime("doCreateNewEntryInItemsTable", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public boolean createItemsTableIfNot(ItemsVO vo) {
+ logger.debug("JDBC::createItemsTableIfNot");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doCreateItemsTableIfNot(vo);
+ logTime("doCreateItemsTableIfNot", timerStart, System.currentTimeMillis());
+ return true;
+ }
+
+ public ItemsVO deleteItemsEntry(ItemsVO vo) {
+ logger.debug("JDBC::deleteItemsEntry");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doDeleteItemsEntry(vo);
+ logTime("deleteItemsEntry", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public List<ItemsVO> getItemIDTableNames() {
+ logger.debug("JDBC::getItemIDTableNames");
+ long timerStart = System.currentTimeMillis();
+ List<ItemsVO> vo = conf.getDBDAO().doGetItemIDTableNames(new ItemsVO());
+ logTime("getItemIDTableNames", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public List<ItemsVO> getItemTables() {
+ logger.debug("JDBC::getItemTables");
+ long timerStart = System.currentTimeMillis();
+ ItemsVO vo = new ItemsVO();
+ vo.setJdbcUriDatabaseName(conf.getDbName());
+ List<ItemsVO> vol = conf.getDBDAO().doGetItemTables(vo);
+ logTime("getItemTables", timerStart, System.currentTimeMillis());
+ return vol;
+ }
+
+ /****************
+ * MAPPERS ITEM *
+ ****************/
+ public void updateItemTableNames(List<ItemVO> vol) {
+ logger.debug("JDBC::updateItemTableNames");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doUpdateItemTableNames(vol);
+ logTime("updateItemTableNames", timerStart, System.currentTimeMillis());
+ }
+
+ public ItemVO createItemTable(ItemVO vo) {
+ logger.debug("JDBC::createItemTable");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doCreateItemTable(vo);
+ logTime("createItemTable", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public Item storeItemValue(Item item) {
+ logger.debug("JDBC::storeItemValue: item={}", item.toString());
+ String tableName = getTable(item);
+ if (tableName == null) {
+ logger.error("JDBC::store: Unable to store item '{}'.", item.getName());
+ return item;
+ }
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doStoreItemValue(item, new ItemVO(tableName, null));
+ logTime("storeItemValue", timerStart, System.currentTimeMillis());
+ errCnt = 0;
+ return item;
+ }
+
+ public List<HistoricItem> getHistItemFilterQuery(FilterCriteria filter, int numberDecimalcount, String table,
+ Item item) {
+ logger.debug(
+ "JDBC::getHistItemFilterQuery filter='{}' numberDecimalcount='{}' table='{}' item='{}' itemName='{}'",
+ (filter != null), numberDecimalcount, table, item, item.getName());
+ if (table != null) {
+ long timerStart = System.currentTimeMillis();
+ List<HistoricItem> r = conf.getDBDAO().doGetHistItemFilterQuery(item, filter, numberDecimalcount, table,
+ item.getName());
+ logTime("getHistItemFilterQuery", timerStart, System.currentTimeMillis());
+ return r;
+ } else {
+ logger.error("JDBC::getHistItemFilterQuery: TABLE is NULL; cannot get data from non-existent table.");
+ }
+ return null;
+ }
+
+ /***********************
+ * DATABASE CONNECTION *
+ ***********************/
+ protected boolean openConnection() {
+ logger.debug("JDBC::openConnection isDriverAvailable: {}", conf.isDriverAvailable());
+ if (conf.isDriverAvailable() && !conf.isDbConnected()) {
+ logger.info("JDBC::openConnection: Driver is available::Yank setupDataSource");
+ Yank.setupDefaultConnectionPool(conf.getHikariConfiguration());
+ conf.setDbConnected(true);
+ return true;
+ } else if (!conf.isDriverAvailable()) {
+ logger.warn("JDBC::openConnection: no driver available!");
+ initialized = false;
+ return false;
+ }
+ return true;
+ }
+
+ protected void closeConnection() {
+ logger.debug("JDBC::closeConnection");
+ // Closes all open connection pools
+ Yank.releaseDefaultConnectionPool();
+ conf.setDbConnected(false);
+ }
+
+ protected boolean checkDBAccessability() {
+ // Check if connection is valid
+ if (initialized) {
+ return true;
+ }
+ // first
+ boolean p = pingDB();
+ if (p) {
+ logger.debug("JDBC::checkDBAccessability, first try connection: {}", p);
+ return (p && !(conf.getErrReconnectThreshold() > 0 && errCnt <= conf.getErrReconnectThreshold()));
+ } else {
+ // second
+ p = pingDB();
+ logger.debug("JDBC::checkDBAccessability, second try connection: {}", p);
+ return (p && !(conf.getErrReconnectThreshold() > 0 && errCnt <= conf.getErrReconnectThreshold()));
+ }
+ }
+
+ /**************************
+ * DATABASE TABLEHANDLING *
+ **************************/
+ protected void checkDBSchema() {
+ // Create the items table if it does not exist
+ createItemsTableIfNot(new ItemsVO());
+ if (conf.getRebuildTableNames()) {
+ formatTableNames();
+ } else {
+ List<ItemsVO> al;
+ // Reset the error counter
+ errCnt = 0;
+ al = getItemIDTableNames();
+ for (int i = 0; i < al.size(); i++) {
+ String t = getTableName(al.get(i).getItemid(), al.get(i).getItemname());
+ sqlTables.put(al.get(i).getItemname(), t);
+ }
+ }
+ }
+
+ protected String getTable(Item item) {
+ int rowId = 0;
+ ItemsVO isvo;
+ ItemVO ivo;
+
+ String itemName = item.getName();
+ String tableName = sqlTables.get(itemName);
+
+ // Table already exists - return the name
+ if (tableName != null) {
+ return tableName;
+ }
+
+ logger.debug("JDBC::getTable: no table found for item '{}' in sqlTables", itemName);
+
+ // Create a new entry in items table
+ isvo = new ItemsVO();
+ isvo.setItemname(itemName);
+ isvo = createNewEntryInItemsTable(isvo);
+ rowId = isvo.getItemid();
+ if (rowId == 0) {
+ logger.error("JDBC::getTable: Creating table for item '{}' failed.", itemName);
+ }
+ // Create the table name
+ logger.debug("JDBC::getTable: getTableName with rowId={} itemName={}", rowId, itemName);
+ tableName = getTableName(rowId, itemName);
+
+ // An error occurred adding the item name into the index list!
+ if (tableName == null) {
+ logger.error("JDBC::getTable: tableName was null; could not create a table for item '{}'", itemName);
+ return null;
+ }
+
+ // Create table for item
+ String dataType = conf.getDBDAO().getDataType(item);
+ ivo = new ItemVO(tableName, itemName);
+ ivo.setDbType(dataType);
+ ivo = createItemTable(ivo);
+ logger.debug("JDBC::getTable: Table created for item '{}' with dataType {} in SQL database.", itemName,
+ dataType);
+ sqlTables.put(itemName, tableName);
+
+ // Check if the new entry is in the table list
+ // If it's not in the list, then there was an error and we need to do
+ // some tidying up
+ // The item needs to be removed from the index table to avoid duplicates
+ if (sqlTables.get(itemName) == null) {
+ logger.error("JDBC::getTable: Item '{}' was not added to the table - removing index", itemName);
+ isvo = new ItemsVO();
+ isvo.setItemname(itemName);
+ deleteItemsEntry(isvo);
+ }
+
+ return tableName;
+ }
+
+ private void formatTableNames() {
+
+ boolean tmpinit = initialized;
+ if (tmpinit) {
+ initialized = false;
+ }
+
+ List<ItemsVO> al;
+ HashMap<Integer, String> tableIds = new HashMap<>();
+
+ //
+ al = getItemIDTableNames();
+ for (int i = 0; i < al.size(); i++) {
+ String t = getTableName(al.get(i).getItemid(), al.get(i).getItemname());
+ sqlTables.put(al.get(i).getItemname(), t);
+ tableIds.put(al.get(i).getItemid(), t);
+ }
+
+ // Compare the existing item tables against the generated names
+ al = getItemTables();
+
+ String oldName = "";
+ String newName = "";
+ List<ItemVO> oldNewTablenames = new ArrayList<>();
+ for (int i = 0; i < al.size(); i++) {
+ int id = -1;
+ oldName = al.get(i).getTable_name();
+ logger.info("JDBC::formatTableNames: found Table Name= {}", oldName);
+
+ if (oldName.startsWith(conf.getTableNamePrefix()) && !oldName.contains("_")) {
+ id = Integer.parseInt(oldName.substring(conf.getTableNamePrefix().length()));
+ logger.info("JDBC::formatTableNames: found Table with Prefix '{}' Name= {} id= {}",
+ conf.getTableNamePrefix(), oldName, (id));
+ } else if (oldName.contains("_")) {
+ id = Integer.parseInt(oldName.substring(oldName.lastIndexOf("_") + 1));
+ logger.info("JDBC::formatTableNames: found Table Name= {} id= {}", oldName, (id));
+ }
+ logger.info("JDBC::formatTableNames: found Table id= {}", id);
+
+ newName = tableIds.get(id);
+ logger.info("JDBC::formatTableNames: found Table newName= {}", newName);
+
+ if (newName != null) {
+ if (!oldName.equalsIgnoreCase(newName)) {
+ oldNewTablenames.add(new ItemVO(oldName, newName));
+ logger.info("JDBC::formatTableNames: Table '{}' will be renamed to '{}'", oldName, newName);
+ } else {
+ logger.info("JDBC::formatTableNames: Table oldName='{}' newName='{}' nothing to rename", oldName,
+ newName);
+ }
+ } else {
+ logger.error("JDBC::formatTableNames: Table '{}' could NOT be renamed to '{}'", oldName, newName);
+ break;
+ }
+ }
+
+ // Rename the collected tables
+ updateItemTableNames(oldNewTablenames);
+ initialized = tmpinit;
+ }
+
+ private String getTableName(int rowId, String itemName) {
+ return getTableNamePrefix(itemName) + formatRight(rowId, conf.getTableIdDigitCount());
+ }
+
+ private String getTableNamePrefix(String itemName) {
+ String name = conf.getTableNamePrefix();
+ if (conf.getTableUseRealItemNames()) {
+ // Create the table name with real Item Names
+ name = (itemName.replaceAll(ITEM_NAME_PATTERN, "") + "_").toLowerCase();
+ }
+ return name;
+ }
+
+ private static String formatRight(final Object value, final int len) {
+ final String valueAsString = String.valueOf(value);
+ if (valueAsString.length() < len) {
+ final StringBuilder result = new StringBuilder(len);
+ for (int i = len - valueAsString.length(); i > 0; i--) {
+ result.append('0');
+ }
+ result.append(valueAsString);
+ return result.toString();
+ } else {
+ return valueAsString;
+ }
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+ private void logTime(String me, long timerStart, long timerStop) {
+ if (conf.enableLogTime && logger.isInfoEnabled()) {
+ conf.timerCount++;
+ int timerDiff = (int) (timerStop - timerStart);
+ if (timerDiff < afterAccessMin) {
+ afterAccessMin = timerDiff;
+ }
+ if (timerDiff > afterAccessMax) {
+ afterAccessMax = timerDiff;
+ }
+ conf.timeAverage50arr.add(timerDiff);
+ conf.timeAverage100arr.add(timerDiff);
+ conf.timeAverage200arr.add(timerDiff);
+ if (conf.timerCount == 1) {
+ conf.timer1000 = System.currentTimeMillis();
+ }
+ if (conf.timerCount == 1001) {
+ conf.time1000Statements = (int) ((System.currentTimeMillis() - conf.timer1000) / 1000); // seconds
+ conf.timerCount = 0;
+ }
+ logger.info(
+ "JDBC::logTime: '{}':\n afterAccess = {} ms\n timeAverage50 = {} ms\n timeAverage100 = {} ms\n timeAverage200 = {} ms\n afterAccessMin = {} ms\n afterAccessMax = {} ms\n 1000Statements = {} sec\n statementCount = {}\n",
+ me, timerDiff, conf.timeAverage50arr.getAverageInteger(),
+ conf.timeAverage100arr.getAverageInteger(), conf.timeAverage200arr.getAverageInteger(),
+ afterAccessMin, afterAccessMax, conf.time1000Statements, conf.timerCount);
+ }
+ }
+
+}
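As an aside to the diff above: the naming scheme implemented by `getTableName`, `getTableNamePrefix` and `formatRight` (a prefix plus a zero-padded row id, or a sanitized lower-case item name when real item names are enabled) can be sketched standalone. The sanitizing pattern `[^a-zA-Z0-9]` and the digit count used here are assumptions standing in for `ITEM_NAME_PATTERN` and the configuration lookups, which are defined elsewhere in the bundle.

```java
// Standalone sketch of the table-naming scheme used by
// getTableName/getTableNamePrefix/formatRight in JdbcMapper.
public class TableNameSketch {

    // Left-pad the value with zeros up to len characters.
    static String formatRight(Object value, int len) {
        String s = String.valueOf(value);
        StringBuilder result = new StringBuilder(len);
        for (int i = len - s.length(); i > 0; i--) {
            result.append('0');
        }
        return result.append(s).toString();
    }

    // prefix + zero-padded row id, e.g. ("item", 7, 4) -> "item0007"
    static String tableName(String prefix, int rowId, int digits) {
        return prefix + formatRight(rowId, digits);
    }

    // With "real item names" enabled: sanitized lower-case name plus "_".
    // The pattern is an assumption standing in for ITEM_NAME_PATTERN.
    static String realNamePrefix(String itemName) {
        return (itemName.replaceAll("[^a-zA-Z0-9]", "") + "_").toLowerCase();
    }

    public static void main(String[] args) {
        System.out.println(tableName("item", 7, 4)); // item0007
        System.out.println(tableName(realNamePrefix("Kitchen_Temp"), 12, 4));
    }
}
```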
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcPersistenceService.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcPersistenceService.java
new file mode 100644
index 0000000000000..e309b761ec263
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/internal/JdbcPersistenceService.java
@@ -0,0 +1,232 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Persistence service implementation
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ *
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.jdbc", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class JdbcPersistenceService extends JdbcMapper implements QueryablePersistenceService {
+ private final Logger logger = LoggerFactory.getLogger(JdbcPersistenceService.class);
+
+ @Reference
+ protected @NonNullByDefault({}) ItemRegistry itemRegistry;
+
+ /**
+ * Called by the SCR to activate the component with its configuration read
+ * from the ConfigAdmin service
+ *
+ * @param bundleContext
+ * BundleContext of the Bundle that defines this component
+ * @param configuration
+ * Configuration properties for this component obtained from the
+ * ConfigAdmin service
+ */
+ @Activate
+ public void activate(BundleContext bundleContext, Map<Object, Object> configuration) {
+ logger.debug("JDBC::activate: persistence service activated");
+ updateConfig(configuration);
+ }
+
+ /**
+ * Called by the SCR to deactivate the component when either the
+ * configuration is removed or mandatory references are no longer satisfied
+ * or the component has simply been stopped.
+ *
+ * @param reason
+ * Reason code for the deactivation:
+ * <ul>
+ * <li>0 – Unspecified</li>
+ * <li>1 – The component was disabled</li>
+ * <li>2 – A reference became unsatisfied</li>
+ * <li>3 – A configuration was changed</li>
+ * <li>4 – A configuration was deleted</li>
+ * <li>5 – The component was disposed</li>
+ * <li>6 – The bundle was stopped</li>
+ * </ul>
+ */
+ @Deactivate
+ public void deactivate(final int reason) {
+ logger.debug("JDBC::deactivate: persistence bundle stopping. Disconnecting from database. reason={}", reason);
+ // closeConnection();
+ initialized = false;
+ }
+
+ @Override
+ public String getId() {
+ logger.debug("JDBC::getId: returning id 'jdbc' for queryable persistence service.");
+ return "jdbc";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "JDBC";
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ // Do not store undefined/uninitialised data
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("JDBC::store: ignore Item '{}' because it is UnDefType", item.getName());
+ return;
+ }
+ if (!checkDBAccessability()) {
+ logger.warn(
+ "JDBC::store: No connection to database. Cannot persist item '{}'! Will retry connecting to database when error count:{} equals errReconnectThreshold:{}",
+ item, errCnt, conf.getErrReconnectThreshold());
+ return;
+ }
+ long timerStart = System.currentTimeMillis();
+ storeItemValue(item);
+ logger.debug("JDBC: Stored item '{}' as '{}' in SQL database at {} in {} ms.", item.getName(),
+ item.getState().toString(), (new java.util.Date()).toString(), System.currentTimeMillis() - timerStart);
+ }
+
+ @Override
+ public @NonNull Set<@NonNull PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ /**
+ * Queries the {@link PersistenceService} for data with a given filter
+ * criteria
+ *
+ * @param filter
+ * the filter to apply to the query
+ * @return a time series of items
+ */
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+
+ if (!checkDBAccessability()) {
+ logger.warn("JDBC::query: database not connected, query aborted for item '{}'", filter.getItemName());
+ return Collections.emptyList();
+ }
+ if (itemRegistry == null) {
+ logger.error("JDBC::query: itemRegistry == null. Ignore and give up!");
+ return Collections.emptyList();
+ }
+
+ // Get the item name from the filter
+ // Also get the Item object so we can determine the type
+ Item item = null;
+ String itemName = filter.getItemName();
+ logger.debug("JDBC::query: item is {}", itemName);
+ try {
+ item = itemRegistry.getItem(itemName);
+ } catch (ItemNotFoundException e1) {
+ logger.error("JDBC::query: unable to get item for itemName: '{}'. Ignore and give up!", itemName);
+ return Collections.emptyList();
+ }
+
+ if (item instanceof GroupItem) {
+ // For a GroupItem, the base item is needed to determine the correct value type.
+ item = ((GroupItem) item).getBaseItem();
+ logger.debug("JDBC::query: item '{}' is a GroupItem, using its base item", itemName);
+ if (item == null) {
+ logger.debug("JDBC::query: BaseItem of GroupItem is null. Ignore and give up!");
+ return Collections.emptyList();
+ }
+ if (item instanceof GroupItem) {
+ logger.debug("JDBC::query: BaseItem of GroupItem is a GroupItem too. Ignore and give up!");
+ return Collections.emptyList();
+ }
+ }
+
+ String table = sqlTables.get(itemName);
+ if (table == null) {
+ logger.warn(
+ "JDBC::query: unable to find table for query, no data in database for item '{}'. Current number of tables in the database: {}",
+ itemName, sqlTables.size());
+ // if enabled, table will be created immediately
+ logger.warn("JDBC::query: try to generate the table for item '{}'", itemName);
+ table = getTable(item);
+ }
+
+ long timerStart = System.currentTimeMillis();
+ List<HistoricItem> items = getHistItemFilterQuery(filter, conf.getNumberDecimalcount(), table, item);
+
+ logger.debug("JDBC::query: query for {} returned {} rows in {} ms", item.getName(), items.size(),
+ System.currentTimeMillis() - timerStart);
+
+ // Success
+ errCnt = 0;
+ return items;
+ }
+
+ public void updateConfig(Map<Object, Object> configuration) {
+ logger.debug("JDBC::updateConfig");
+
+ conf = new JdbcConfiguration(configuration);
+ if (conf.valid && checkDBAccessability()) {
+ checkDBSchema();
+ // connection has been established ... initialization completed!
+ initialized = true;
+ } else {
+ initialized = false;
+ }
+
+ logger.debug("JDBC::updateConfig: configuration complete for service={}.", getId());
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+
+}
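The `store()` warning above refers to an error-count/reconnect-threshold mechanism (`errCnt`, `errReconnectThreshold`) whose implementation (`checkDBAccessability`) is not part of this diff. A hypothetical illustration of that pattern, with names and logic that are assumptions rather than the actual implementation, might look like:

```java
// Hypothetical sketch of an error-count/reconnect-threshold gate:
// writes are skipped while the database is unreachable, and a reconnect
// is attempted only once the error count reaches the threshold.
public class ReconnectGate {
    private final int errReconnectThreshold;
    private int errCnt = 0;
    private boolean connected = false;

    public ReconnectGate(int errReconnectThreshold) {
        this.errReconnectThreshold = errReconnectThreshold;
    }

    // Returns true when the database is usable; counts failed checks and
    // retries the connection once the threshold is reached.
    public boolean checkAccessible() {
        if (connected) {
            return true;
        }
        errCnt++;
        if (errCnt >= errReconnectThreshold) {
            errCnt = 0;
            connected = tryReconnect();
        }
        return connected;
    }

    // Stand-in for a real reconnect attempt.
    protected boolean tryReconnect() {
        return true;
    }
}
```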
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemVO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemVO.java
new file mode 100644
index 0000000000000..64e05e16c523f
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemVO.java
@@ -0,0 +1,167 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.io.Serializable;
+import java.util.Date;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Represents the Item-data on the part of MyBatis/database.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class ItemVO implements Serializable {
+ private final Logger logger = LoggerFactory.getLogger(ItemVO.class);
+
+ private static final long serialVersionUID = 1871441039821454890L;
+
+ private String tableName;
+ private String newTableName;
+ private String dbType;
+ private String jdbcType;
+ private String itemType;
+ private Class<?> javaType;
+ private Date time;
+ private Object value;
+
+ public ItemVO(String tableName, String newTableName) {
+ logger.debug("JDBC:ItemVO tableName={}; newTableName={}; ", tableName, newTableName);
+ this.tableName = tableName;
+ this.newTableName = newTableName;
+ }
+
+ public ItemVO() {
+ }
+
+ public void setValueTypes(String dbType, Class<?> javaType) {
+ logger.debug("JDBC:ItemVO setValueTypes dbType={}; javaType={};", dbType, javaType);
+ this.dbType = dbType;
+ this.javaType = javaType;
+ }
+
+ public String getTableName() {
+ return tableName;
+ }
+
+ public void setTableName(String tableName) {
+ this.tableName = tableName;
+ }
+
+ public String getNewTableName() {
+ return newTableName;
+ }
+
+ public void setNewTableName(String newTableName) {
+ this.newTableName = newTableName;
+ }
+
+ public String getDbType() {
+ return dbType;
+ }
+
+ public void setDbType(String dbType) {
+ this.dbType = dbType;
+ }
+
+ public String getJdbcType() {
+ return jdbcType;
+ }
+
+ public void setJdbcType(String jdbcType) {
+ this.jdbcType = jdbcType;
+ }
+
+ public String getItemType() {
+ return itemType;
+ }
+
+ public void setItemType(String itemType) {
+ this.itemType = itemType;
+ }
+
+ public String getJavaType() {
+ return javaType.getName();
+ }
+
+ public void setJavaType(Class<?> javaType) {
+ this.javaType = javaType;
+ }
+
+ public Date getTime() {
+ return time;
+ }
+
+ public void setTime(Date time) {
+ this.time = time;
+ }
+
+ public Object getValue() {
+ return value;
+ }
+
+ public void setValue(Object value) {
+ this.value = value;
+ }
+
+ /**
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#equals(java.lang.Object)
+ */
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ItemVO other = (ItemVO) obj;
+ if (value == null) {
+ if (other.value != null) {
+ return false;
+ }
+ } else if (!value.equals(other.value)) {
+ return false;
+ }
+ if (time == null ? other.time != null : !time.equals(other.time)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ItemVO [tableName=");
+ builder.append(tableName);
+ builder.append(", newTableName=");
+ builder.append(newTableName);
+ builder.append(", dbType=");
+ builder.append(dbType);
+ builder.append(", javaType=");
+ builder.append(javaType);
+ builder.append(", time=");
+ builder.append(time);
+ builder.append(", value=");
+ builder.append(value);
+ builder.append("]");
+ return builder.toString();
+ }
+}
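A note on the `equals` above: `ItemVO` compares `value` and `time` but does not override `hashCode`, which breaks the equals/hashCode contract for hash-based collections. With `java.util.Objects`, a null-safe pair over the same two fields is concise; the class below is a minimal standalone sketch, not the actual `ItemVO`:

```java
import java.util.Date;
import java.util.Objects;

// Minimal sketch of a null-safe equals/hashCode pair over the same two
// fields ItemVO compares (value and time). java.util.Objects handles
// the null checks that the hand-written version spells out.
public class ValueTimePair {
    private final Object value;
    private final Date time;

    public ValueTimePair(Object value, Date time) {
        this.value = value;
        this.time = time;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (obj == null || getClass() != obj.getClass()) {
            return false;
        }
        ValueTimePair other = (ValueTimePair) obj;
        // Objects.equals is null-safe and uses equals(), not ==
        return Objects.equals(value, other.value) && Objects.equals(time, other.time);
    }

    @Override
    public int hashCode() {
        // Must always be overridden together with equals()
        return Objects.hash(value, time);
    }
}
```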
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemsVO.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemsVO.java
new file mode 100644
index 0000000000000..6415ab6dc3057
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/ItemsVO.java
@@ -0,0 +1,155 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.io.Serializable;
+
+/**
+ * Represents the table naming data.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class ItemsVO implements Serializable {
+
+ private static final long serialVersionUID = 2871961811177601520L;
+
+ private static final String STR_FILTER = "[^a-zA-Z0-9]";
+
+ private String coltype = "VARCHAR(500)";
+ private String colname = "itemname";
+ private String itemsManageTable = "items";
+ private int itemid;
+ private String itemname;
+ private String table_name;
+ private String jdbcUriDatabaseName;
+
+ public String getColtype() {
+ return coltype;
+ }
+
+ public void setColtype(String coltype) {
+ this.coltype = coltype.replaceAll(STR_FILTER, "");
+ }
+
+ public String getColname() {
+ return colname;
+ }
+
+ public void setColname(String colname) {
+ this.colname = colname.replaceAll(STR_FILTER, "");
+ }
+
+ public String getItemsManageTable() {
+ return itemsManageTable;
+ }
+
+ public void setItemsManageTable(String itemsManageTable) {
+ this.itemsManageTable = itemsManageTable.replaceAll(STR_FILTER, "");
+ }
+
+ public int getItemid() {
+ return itemid;
+ }
+
+ public void setItemid(int itemid) {
+ this.itemid = itemid;
+ }
+
+ public String getItemname() {
+ return itemname;
+ }
+
+ public void setItemname(String itemname) {
+ this.itemname = itemname;
+ }
+
+ public String getTable_name() {
+ return table_name;
+ }
+
+ public void setTable_name(String table_name) {
+ this.table_name = table_name;
+ }
+
+ public String getJdbcUriDatabaseName() {
+ return jdbcUriDatabaseName;
+ }
+
+ public void setJdbcUriDatabaseName(String jdbcUriDatabaseName) {
+ this.jdbcUriDatabaseName = jdbcUriDatabaseName;
+ }
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#hashCode()
+ */
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((itemname == null) ? 0 : itemname.hashCode());
+ result = prime * result + itemid;
+ return result;
+ }
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#equals(java.lang.Object)
+ */
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ItemsVO other = (ItemsVO) obj;
+ if (itemname == null) {
+ if (other.itemname != null) {
+ return false;
+ }
+ } else if (!itemname.equals(other.itemname)) {
+ return false;
+ }
+ if (itemid != other.itemid) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ItemsVO [coltype=");
+ builder.append(coltype);
+ builder.append(", colname=");
+ builder.append(colname);
+ builder.append(", itemsManageTable=");
+ builder.append(itemsManageTable);
+ builder.append(", itemid=");
+ builder.append(itemid);
+ builder.append(", itemname=");
+ builder.append(itemname);
+ builder.append(", table_name=");
+ builder.append(table_name);
+ builder.append("]");
+ return builder.toString();
+ }
+
+}
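A hashing pitfall worth noting here: the idiom `(v ^ (v >>> 32))` comes from folding a `long` into an `int` hash. Applied to an `int`, the shift distance is reduced mod 32 (JLS §15.19), so `v >>> 32` is `v` itself and the XOR collapses to zero for every input:

```java
// Demonstrates why the long-folding idiom (v ^ (v >>> 32)) must not be
// applied to an int: for int operands the shift distance is reduced
// mod 32, so v >>> 32 == v and the XOR collapses to zero.
public class ShiftPitfall {

    static int foldInt(int v) {
        return v ^ (v >>> 32); // shift by 32 on an int is a no-op
    }

    static int foldLong(long v) {
        return (int) (v ^ (v >>> 32)); // correct: mixes both halves of a long
    }

    public static void main(String[] args) {
        System.out.println(foldInt(12345)); // always 0
        System.out.println(foldLong(12345L));
    }
}
```

This is why an `int` field like `itemid` should be added to the hash directly.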
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/JdbcItem.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/JdbcItem.java
new file mode 100644
index 0000000000000..65e885fdfdeef
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/model/JdbcItem.java
@@ -0,0 +1,65 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.util.Date;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * Represents the data on the part of openHAB.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final Date timestamp;
+
+ public JdbcItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("JdbcItem [name=");
+ builder.append(name);
+ builder.append(", state=");
+ builder.append(state);
+ builder.append(", timestamp=");
+ builder.append(timestamp);
+ builder.append("]");
+ return builder.toString();
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/DbMetaData.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/DbMetaData.java
new file mode 100644
index 0000000000000..fc3a7402bb256
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/DbMetaData.java
@@ -0,0 +1,129 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.sql.DatabaseMetaData;
+import java.sql.SQLException;
+
+import org.knowm.yank.Yank;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.zaxxer.hikari.HikariDataSource;
+
+/**
+ * Meta data class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class DbMetaData {
+
+ private static final Logger logger = LoggerFactory.getLogger(DbMetaData.class);
+
+ private int dbMajorVersion;
+ private int dbMinorVersion;
+ private int driverMajorVersion;
+ private int driverMinorVersion;
+ private String dbProductName;
+ private String dbProductVersion;
+
+ public DbMetaData() {
+ HikariDataSource h = Yank.getDefaultConnectionPool();
+
+ DatabaseMetaData meta;
+ // Use try-with-resources so the borrowed connection is returned to the pool
+ try (java.sql.Connection connection = h.getConnection()) {
+ meta = connection.getMetaData();
+
+ // Oracle (and some other vendors) do not support some of the
+ // following methods, so each call is wrapped in its own
+ // try-catch block.
+ try {
+ dbMajorVersion = meta.getDatabaseMajorVersion();
+ logger.debug("dbMajorVersion = '{}'", dbMajorVersion);
+ } catch (Exception e) {
+ logger.error("Asking for 'dbMajorVersion' is unsupported: '{}'", e.getMessage());
+ }
+
+ try {
+ dbMinorVersion = meta.getDatabaseMinorVersion();
+ logger.debug("dbMinorVersion = '{}'", dbMinorVersion);
+ } catch (Exception e) {
+ logger.error("Asking for 'dbMinorVersion' is unsupported: '{}'", e.getMessage());
+ }
+
+ driverMajorVersion = meta.getDriverMajorVersion();
+ logger.debug("driverMajorVersion = '{}'", driverMajorVersion);
+
+ driverMinorVersion = meta.getDriverMinorVersion();
+ logger.debug("driverMinorVersion = '{}'", driverMinorVersion);
+
+ dbProductName = meta.getDatabaseProductName();
+ logger.debug("dbProductName = '{}'", dbProductName);
+
+ dbProductVersion = meta.getDatabaseProductVersion();
+ logger.debug("dbProductVersion = '{}'", dbProductVersion);
+
+ } catch (SQLException e1) {
+ logger.error("Failed to read database metadata: '{}'", e1.getMessage());
+ }
+
+ }
+
+ public int getDbMajorVersion() {
+ return dbMajorVersion;
+ }
+
+ public int getDbMinorVersion() {
+ return dbMinorVersion;
+ }
+
+ public boolean isDbVersionGreater(int major, int minor) {
+ if (dbMajorVersion > major) {
+ return true;
+ } else if (dbMajorVersion == major) {
+ if (dbMinorVersion > minor) {
+ return true;
+ }
+ }
+ return false;
+ }
+
+ public int getDriverMajorVersion() {
+ return driverMajorVersion;
+ }
+
+ public int getDriverMinorVersion() {
+ return driverMinorVersion;
+ }
+
+ public boolean isDriverVersionGreater(int major, int minor) {
+ if (major > driverMajorVersion) {
+ return true;
+ } else if (major == driverMajorVersion) {
+ if (minor > driverMinorVersion) {
+ return true;
+ }
+ }
+ return false;
+ }
+
+ public String getDbProductName() {
+ return dbProductName;
+ }
+
+ public String getDbProductVersion() {
+ return dbProductVersion;
+ }
+
+}
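Note that the two comparison helpers above read in opposite directions: `isDbVersionGreater(major, minor)` is true when the *database* version exceeds the given one, while `isDriverVersionGreater(major, minor)` is true when the *given* version exceeds the driver's. Both are instances of lexicographic `(major, minor)` comparison, which can be sketched once as a Comparator-style function:

```java
// Standalone sketch of the lexicographic (major, minor) comparison
// underlying both version checks in DbMetaData. Returns a negative,
// zero, or positive value like a Comparator.
public class VersionCompare {

    static int compare(int majorA, int minorA, int majorB, int minorB) {
        if (majorA != majorB) {
            return Integer.compare(majorA, majorB);
        }
        return Integer.compare(minorA, minorB);
    }

    // "is the database version strictly newer than (major, minor)?"
    static boolean dbVersionGreater(int dbMajor, int dbMinor, int major, int minor) {
        return compare(dbMajor, dbMinor, major, minor) > 0;
    }
}
```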
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/MovingAverage.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/MovingAverage.java
new file mode 100644
index 0000000000000..15ccd4adcd54f
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/MovingAverage.java
@@ -0,0 +1,72 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.LinkedList;
+import java.util.Queue;
+
+/**
+ * Calculates the average/mean of a number series.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class MovingAverage {
+
+ private final Queue<BigDecimal> win = new LinkedList<>();
+ private final int period;
+ private BigDecimal sum = BigDecimal.ZERO;
+
+ public MovingAverage(int period) {
+ assert period > 0 : "Period must be a positive integer";
+ this.period = period;
+ }
+
+ public void add(Double num) {
+ add(BigDecimal.valueOf(num));
+ }
+
+ public void add(Long num) {
+ add(BigDecimal.valueOf(num));
+ }
+
+ public void add(Integer num) {
+ add(BigDecimal.valueOf(num));
+ }
+
+ public void add(BigDecimal num) {
+ sum = sum.add(num);
+ win.add(num);
+ if (win.size() > period) {
+ sum = sum.subtract(win.remove());
+ }
+ }
+
+ public BigDecimal getAverage() {
+ if (win.isEmpty()) {
+ return BigDecimal.ZERO; // technically the average is undefined
+ }
+ BigDecimal divisor = BigDecimal.valueOf(win.size());
+ return sum.divide(divisor, 2, RoundingMode.HALF_UP);
+ }
+
+ public double getAverageDouble() {
+ return getAverage().doubleValue();
+ }
+
+ public int getAverageInteger() {
+ return getAverage().intValue();
+ }
+}
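The key design choice in `MovingAverage` is the running sum: each `add()` adjusts `sum` incrementally (adding the new sample, evicting and subtracting the oldest when the window overflows), so computing the average never requires re-summing the window. The self-contained re-implementation below shows the same trick with an `ArrayDeque` instead of a `LinkedList`; it is a sketch mirroring the class above, not the class itself:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal windowed average with an O(1) running sum: add() adjusts the
// sum incrementally instead of re-summing the window on every query.
public class WindowAverage {
    private final Deque<BigDecimal> win = new ArrayDeque<>();
    private final int period;
    private BigDecimal sum = BigDecimal.ZERO;

    public WindowAverage(int period) {
        this.period = period;
    }

    public void add(BigDecimal num) {
        sum = sum.add(num);
        win.add(num);
        if (win.size() > period) {
            // Evict the oldest sample and subtract it from the running sum
            sum = sum.subtract(win.remove());
        }
    }

    public BigDecimal getAverage() {
        if (win.isEmpty()) {
            return BigDecimal.ZERO; // the average is technically undefined
        }
        return sum.divide(BigDecimal.valueOf(win.size()), 2, RoundingMode.HALF_UP);
    }
}
```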
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/StringUtilsExt.java b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/StringUtilsExt.java
new file mode 100644
index 0000000000000..c8d1afa690013
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/java/org/openhab/persistence/jdbc/utils/StringUtilsExt.java
@@ -0,0 +1,296 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.util.ArrayList;
+import java.util.Properties;
+
+import org.openhab.core.persistence.FilterCriteria;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Utility class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class StringUtilsExt {
+ private static final Logger logger = LoggerFactory.getLogger(StringUtilsExt.class);
+
+ /**
+ * Replaces successive occurrences of a separator in the given String with the array contents.
+ *
+ * @param str - String in which the replacements are made
+ * @param separate - regular expression to replace, one occurrence per array element
+ * @param separators - replacement values, merged into str in order
+ * @return the merged String
+ */
+ public static final String replaceArrayMerge(String str, String separate, Object[] separators) {
+ for (int i = 0; i < separators.length; i++) {
+ str = str.replaceFirst(separate, (String) separators[i]);
+ }
+ return str;
+ }
+
+ /**
+ * @see #replaceArrayMerge(String str, String separate, Object[] separators)
+ */
+ public static final String replaceArrayMerge(String str, String[] separate, String[] separators) {
+ for (int i = 0; i < separators.length; i++) {
+ str = str.replaceFirst(separate[i], separators[i]);
+ }
+ return str;
+ }
+
+ /**
+ * @see #parseJdbcURL(String url, Properties def)
+ */
+ public static Properties parseJdbcURL(String url) {
+ return parseJdbcURL(url, null);
+ }
+
+ /**
+ * JDBC-URI Examples:
+ * jdbc:dbShortcut:c:/dev/databaseName
+ * jdbc:dbShortcut:scheme:c:/dev/databaseName
+ * jdbc:dbShortcut:scheme:c:\\dev\\databaseName
+ * jdbc:dbShortcut:./databaseName
+ * jdbc:dbShortcut:/databaseName
+ * jdbc:dbShortcut:~/databaseName
+ * jdbc:dbShortcut:/path/databaseName.db
+ * jdbc:dbShortcut:./../../path/databaseName
+ * jdbc:dbShortcut:scheme:./path/../path/databaseName;param1=true;
+ * jdbc:dbShortcut://192.168.0.145:3306/databaseName?param1=false&param2=true
+ *
+ *
+ * @param url - JDBC-URI
+ * @param def - Predefined Properties Object
+ * @return A merged Properties Object may contain:
+ * parseValid (mandatory)
+ * scheme
+ * serverPath
+ * dbShortcut
+ * databaseName
+ * portNumber
+ * serverName
+ * pathQuery
+ */
+ public static Properties parseJdbcURL(String url, Properties def) {
+
+ Properties props;
+ if (def == null) {
+ props = new Properties();
+ } else {
+ props = new Properties(def);
+ }
+
+ if (url == null || url.length() < 9) {
+ return props;
+ }
+
+ // replace all \
+ if (url.contains("\\")) {
+ url = url.replaceAll("\\\\", "/");
+ }
+
+ // replace first ; with ?
+ if (url.contains(";")) {
+ // replace first ; with ?
+ url = url.replaceFirst(";", "?");
+ // replace other ; with &
+ url = url.replaceAll(";", "&");
+ }
+
+ if (url.split(":").length < 3 || url.indexOf("/") == -1) {
+ logger.error("parseJdbcURL: URI '{}' is not well formatted, expected a URI like 'jdbc:dbShortcut:/path'", url);
+ props.put("parseValid", "false");
+ return props;
+ }
+
+ String[] protAndDb = stringBeforeSubstr(url, ":", 1).split(":");
+ if (!"jdbc".equals(protAndDb[0])) {
+ logger.error("parseJdbcURL: URI '{}' is not well formatted, expected prefix 'jdbc' but found '{}'", url,
+ protAndDb[0]);
+ props.put("parseValid", "false");
+ return props;
+ }
+ props.put("parseValid", "true");
+ props.put("dbShortcut", protAndDb[1]);
+
+ URI dbURI = null;
+ try {
+ dbURI = new URI(stringAfterSubstr(url, ":", 1).replaceFirst(" ", ""));
+ if (dbURI.getScheme() != null) {
+ props.put("scheme", dbURI.getScheme());
+ dbURI = new URI(stringAfterSubstr(url, ":", 2).replaceFirst(" ", ""));
+ }
+ } catch (URISyntaxException e) {
+ logger.error("parseJdbcURL: URI '{}' is not well formatted.", url, e);
+ return props;
+ }
+
+ // Query-Parameters
+ if (dbURI.getQuery() != null) {
+ String[] q = dbURI.getQuery().split("&");
+ for (int i = 0; i < q.length; i++) {
+ String[] t = q[i].split("=");
+ props.put(t[0], t[1]);
+ }
+ props.put("pathQuery", dbURI.getQuery());
+ }
+
+ String path = "";
+ if (dbURI.getPath() != null) {
+ String gp = dbURI.getPath();
+ String st = "/";
+ if (gp.indexOf("/") <= 1) {
+ if (substrPos(gp, st).size() > 1) {
+ path = stringBeforeLastSubstr(gp, st) + st;
+ } else {
+ path = stringBeforeSubstr(gp, st) + st;
+ }
+ }
+ if (dbURI.getScheme() != null && dbURI.getScheme().length() == 1) {
+ path = dbURI.getScheme() + ":" + path;
+ }
+ props.put("serverPath", path);
+ }
+ if (dbURI.getPath() != null) {
+ props.put("databaseName", stringAfterLastSubstr(dbURI.getPath(), "/"));
+ }
+ if (dbURI.getPort() != -1) {
+ props.put("portNumber", dbURI.getPort() + "");
+ }
+ if (dbURI.getHost() != null) {
+ props.put("serverName", dbURI.getHost());
+ }
+
+ return props;
+ }
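The normalization steps in `parseJdbcURL` (backslash replacement, then rewriting the first `;` as `?` and the remaining `;` as `&`) can be sketched in isolation. The class below is a hypothetical standalone demo, not part of the bundle; it mirrors only that normalization logic:

```java
public class JdbcUrlNormalize {
    // Mirrors the normalization parseJdbcURL applies before handing the URL to java.net.URI:
    // 1) Windows backslashes become forward slashes
    // 2) the first ';' becomes '?', all following ';' become '&'
    static String normalize(String url) {
        if (url.contains("\\")) {
            url = url.replaceAll("\\\\", "/");
        }
        if (url.contains(";")) {
            url = url.replaceFirst(";", "?");
            url = url.replaceAll(";", "&");
        }
        return url;
    }

    public static void main(String[] args) {
        String in = "jdbc:h2:c:\\dev\\db;user=sa;create=true";
        String out = normalize(in);
        if (!out.equals("jdbc:h2:c:/dev/db?user=sa&create=true")) {
            throw new AssertionError(out);
        }
        System.out.println(out);
    }
}
```

After this rewrite the parameter list parses as a regular URI query string, which is why the later `dbURI.getQuery().split("&")` loop works for both `;`- and `&`-separated parameters.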
+
+ /**
+ * @param s
+ * @param substr
+ * @return - Returns a String before the last occurrence of a Substring
+ */
+ public static String stringBeforeLastSubstr(String s, String substr) {
+ ArrayList<Integer> a = substrPos(s, substr);
+ return s.substring(0, a.get(a.size() - 1));
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @return - Returns a String after the last occurrence of a Substring
+ */
+ public static String stringAfterLastSubstr(String s, String substr) {
+ ArrayList<Integer> a = substrPos(s, substr);
+ return s.substring(a.get(a.size() - 1) + 1);
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @return - Returns a String after the first occurrence of a Substring
+ */
+ public static String stringAfterSubstr(String s, String substr) {
+ return s.substring(s.indexOf(substr) + 1);
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @param n
+ * @return - Returns a String after the nth occurrence of a Substring
+ */
+ public static String stringAfterSubstr(String s, String substr, int n) {
+ return s.substring(substrPos(s, substr).get(n) + 1);
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @return - Returns a String before the first occurrence of a Substring
+ */
+ public static String stringBeforeSubstr(String s, String substr) {
+ return s.substring(0, s.indexOf(substr));
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @param n
+ * @return - Returns a String before the nth occurrence of a Substring
+ */
+ public static String stringBeforeSubstr(String s, String substr, int n) {
+ return s.substring(0, substrPos(s, substr).get(n));
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @return - Returns an ArrayList with Indices of the occurrence of a Substring
+ */
+ public static ArrayList<Integer> substrPos(String s, String substr) {
+ return substrPos(s, substr, true);
+ }
+
+ /**
+ * @param s
+ * @param substr
+ * @param ignoreCase
+ * @return - Returns an ArrayList with Indices of the occurrence of a Substring
+ */
+ public static ArrayList<Integer> substrPos(String s, String substr, boolean ignoreCase) {
+ int substrLength = substr.length();
+ int strLength = s.length();
+ ArrayList<Integer> arr = new ArrayList<>();
+
+ for (int i = 0; i < strLength - substrLength + 1; i++) {
+ if (s.regionMatches(ignoreCase, i, substr, 0, substrLength)) {
+ arr.add(i);
+ }
+ }
+ return arr;
+ }
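The substring helpers above combine to split a database path on its last separator, which is how `parseJdbcURL` derives `serverPath` and `databaseName`. The following standalone sketch (hypothetical class name) mirrors `substrPos` and that splitting usage:

```java
import java.util.ArrayList;
import java.util.List;

public class SubstrDemo {
    // Same scan as substrPos: collect every (case-insensitive) match index of substr in s
    static List<Integer> substrPos(String s, String substr) {
        List<Integer> arr = new ArrayList<>();
        for (int i = 0; i <= s.length() - substr.length(); i++) {
            if (s.regionMatches(true, i, substr, 0, substr.length())) {
                arr.add(i);
            }
        }
        return arr;
    }

    public static void main(String[] args) {
        String path = "/path/databaseName";
        List<Integer> pos = substrPos(path, "/");
        int last = pos.get(pos.size() - 1);
        // stringBeforeLastSubstr / stringAfterLastSubstr split on the last '/'
        String serverPath = path.substring(0, last) + "/"; // "/path/"
        String databaseName = path.substring(last + 1);    // "databaseName"
        if (!serverPath.equals("/path/") || !databaseName.equals("databaseName")) {
            throw new AssertionError(serverPath + " " + databaseName);
        }
    }
}
```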
+
+ /**
+ * Returns a string representation of the given FilterCriteria, analogous to toString().
+ *
+ * @param filter the filter criteria to render
+ * @return string representation of the filter
+ */
+ public static String filterToString(FilterCriteria filter) {
+ StringBuilder builder = new StringBuilder();
+ builder.append("FilterCriteria [itemName=");
+ builder.append(filter.getItemName());
+ builder.append(", beginDate=");
+ builder.append(filter.getBeginDate());
+ builder.append(", endDate=");
+ builder.append(filter.getEndDate());
+ builder.append(", pageNumber=");
+ builder.append(filter.getPageNumber());
+ builder.append(", pageSize=");
+ builder.append(filter.getPageSize());
+ builder.append(", operator=");
+ builder.append(filter.getOperator());
+ builder.append(", ordering=");
+ builder.append(filter.getOrdering());
+ builder.append(", state=");
+ builder.append(filter.getState());
+ builder.append("]");
+ return builder.toString();
+ }
+
+}
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.jdbc/src/main/resources/OH-INF/config/config.xml b/bundles/org.openhab.persistence.jdbc/src/main/resources/OH-INF/config/config.xml
new file mode 100644
index 0000000000000..7fb293354b526
--- /dev/null
+++ b/bundles/org.openhab.persistence.jdbc/src/main/resources/OH-INF/config/config.xml
@@ -0,0 +1,207 @@
+
+
+
+
+
+
+
+
+
+ Database URL
+
+ Required database url like 'jdbc:<service>:<host>[:<port>;<parameters>]'
+ Parameter 'service' is used as the identifier for the selected JDBC driver.
+ URL-Examples:
+ jdbc:derby:./testDerby;create=true
+ jdbc:h2:./testH2
+ jdbc:hsqldb:./testHsqlDb
+ jdbc:mariadb://192.168.0.1:3306/testMariadb
+ jdbc:mysql://192.168.0.1:3306/testMysql
+ jdbc:postgresql://192.168.0.1:5432/testPostgresql
+ jdbc:sqlite:./testSqlite.db]]>
+
+
+
+ Database User
+
+
+
+
+ Database Password
+
+
+
+
+
+ SqlType CALL
+ (optional, default: "VARCHAR(200)").
+ For general information about JdbcTypes/SqlTypes see: https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html
+ see: http://www.h2database.com/html/datatypes.html
+ see: http://www.postgresql.org/docs/9.5/static/datatype.html]]>
+
+
+ SqlType COLOR
+ (optional, default: "VARCHAR(70)").]]>
+
+
+ SqlType CONTACT
+ (optional, default: "VARCHAR(6)").]]>
+
+
+ SqlType DATETIME
+ (optional, default: "DATETIME").]]>
+
+
+ SqlType DIMMER
+ (optional, default: "TINYINT").]]>
+
+
+ SqlType LOCATION
+ (optional, default: "VARCHAR(30)").]]>
+
+
+ SqlType NUMBER
+ (optional, default: "DOUBLE").]]>
+
+
+ SqlType ROLLERSHUTTER
+ (optional, default: "TINYINT").]]>
+
+
+ SqlType STRING
+ (optional, default: "VARCHAR(65500)").]]>
+
+
+ SqlType SWITCH
+ (optional, default: "VARCHAR(6)").]]>
+
+
+
+
+ Tablename Prefix String
+ (optional, default: "item").
+ For migration from MYSQL-Bundle set to 'Item'.]]>
+
+
+ Tablename Realname Generation
+ (optional, default: disabled -> "Tablename Prefix String" is used).
+ If true, 'Tablename Prefix String' is ignored.]]>
+
+ Enable
+ Disable
+
+
+
+ Tablename Suffix ID Count
+ (optional, default: 4 -> 0001-9999).
+ For migration from MYSQL-Bundle set to 0.]]>
+
+
+ Tablename Rebuild
+
+ USE WITH CARE! Deactivate after renaming is done!]]>
+
+ Enable
+ Disable
+
+
+
+
+
+ Connections Max Pool Size
+ (optional, default: differs per database)
+ https://github.com/brettwooldridge/HikariCP/issues/256]]>
+
+
+ Connections Min Idle
+ (optional, default: differs per database)
+ https://github.com/brettwooldridge/HikariCP/issues/256]]>
+
+
+
+
+ Timekeeping Enable
+ (optional, default: disabled)]]>
+
+ Enable
+ Disable
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.jpa/.classpath b/bundles/org.openhab.persistence.jpa/.classpath
new file mode 100644
index 0000000000000..15a6560a73f29
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.classpath
@@ -0,0 +1,32 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.jpa/.project b/bundles/org.openhab.persistence.jpa/.project
new file mode 100644
index 0000000000000..3242a68d79847
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.project
@@ -0,0 +1,23 @@
+
+
+ org.openhab.persistence.jpa
+
+
+
+
+
+ org.eclipse.jdt.core.javabuilder
+
+
+
+
+ org.eclipse.m2e.core.maven2Builder
+
+
+
+
+
+ org.eclipse.jdt.core.javanature
+ org.eclipse.m2e.core.maven2Nature
+
+
diff --git a/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..839d647eef851
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,5 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding//src/main/resources=UTF-8
+encoding//src/test/java=UTF-8
+encoding/=UTF-8
diff --git a/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..2af1e7b99c98d
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,8 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=ignore
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.ui.prefs b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.ui.prefs
new file mode 100644
index 0000000000000..fe89f28bca590
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.jdt.ui.prefs
@@ -0,0 +1,2 @@
+eclipse.preferences.version=1
+formatter_settings_version=12
diff --git a/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..f29e940a0059c
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.jpa/NOTICE b/bundles/org.openhab.persistence.jpa/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-core
+
diff --git a/bundles/org.openhab.persistence.jpa/README.md b/bundles/org.openhab.persistence.jpa/README.md
new file mode 100644
index 0000000000000..67b34f68c42bf
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/README.md
@@ -0,0 +1,39 @@
+# Java Persistence API (JPA) Persistence
+
+This service allows you to persist state updates using a SQL or NoSQL database through the [Java Persistence API](https://en.wikipedia.org/wiki/Java_Persistence_API). The service uses an abstraction layer that theoretically allows it to support many available SQL or NoSQL databases.
+
+It will create one table named `historic_item` where all item states are stored. The item state is stored in a string representation.
+
+The service currently supports MySQL, Apache Derby and PostgreSQL databases. Only the embedded Apache Derby database driver is included. Other drivers must be installed manually. (See below for more information on that.)
+
+## Configuration
+
+This service can be configured in the file `services/jpa.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| url | | Yes | JDBC connection URL. Examples: `jdbc:postgresql://hab.local:5432/openhab` `jdbc:derby://hab.local:1527/openhab;create=true` `jdbc:mysql://localhost:3306/openhab` |
+| driver | | Yes | Database driver class. Examples: `org.postgresql.Driver` `org.apache.derby.jdbc.ClientDriver` `com.mysql.jdbc.Driver` Only the Apache Derby driver is included with the service; drivers for other databases must be installed manually. This is straightforward: JDBC database drivers are normally packaged as OSGi bundles and can simply be dropped into the `addons` folder, which also lets users update their drivers as needed. The following database drivers are known to work: `postgresql-9.4-1203-jdbc41.jar` `postgresql-9.4-1206-jdbc41.jar` |
+| user | | If needed | Database user name for the connection |
+| password | | If needed | Database user password for the connection |
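Putting the table together, a hypothetical `services/jpa.cfg` for PostgreSQL could look like the following (host, database name, and credentials are illustrative placeholders, not defaults shipped with the service):

```
url=jdbc:postgresql://hab.local:5432/openhab
driver=org.postgresql.Driver
user=openhab
password=secret
```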
+
+## Adding support for other JPA supported databases
+
+All item- and event-related configuration is done in the file `persistence/jpa.persist`.
+
+If a database driver is not packaged as an OSGi bundle, support can still be added by extending the openHAB classpath.
+
+Use the following classpath setup in start.sh / start_debug.sh of openHAB:
+
+```
+cp=$(echo lib/*.jar | tr ' ' ':'):$(find $eclipsehome -name "org.eclipse.equinox.launcher_*.jar" | sort | tail -1);
+```
+
+This adds all `.jar` files from a `lib` folder in the openHAB root directory to the classpath.
+
+Any database supported by JPA can be added this way.
+
+Define `driver` and `url` in `services/jpa.cfg` according to the database's documentation.
+
diff --git a/bundles/org.openhab.persistence.jpa/pom.xml b/bundles/org.openhab.persistence.jpa/pom.xml
new file mode 100644
index 0000000000000..89d931393f7ca
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/pom.xml
@@ -0,0 +1,70 @@
+
+
+
+ 4.0.0
+
+
+ org.openhab.addons.bundles
+ org.openhab.addons.reactor.bundles
+ 3.0.0-SNAPSHOT
+
+
+ org.openhab.persistence.jpa
+
+ openHAB Add-ons :: Bundles :: Persistence Service :: JPA
+
+
+ !com.ibm.*,!com.sun.*,!oracle.*,!org.apache.bval.*,!org.apache.geronimo.*,!org.apache.avalon.*,!org.apache.log,!org.apache.tools.*,!org.apache.xerces.*,!org.jboss.*,!org.postgresql.*,!org.slf4j.impl,!weblogic.*,!javax.rmi
+
+
+
+
+
+ org.apache.openjpa
+ openjpa-all
+ 2.4.0
+
+
+
+ org.apache.derby
+ derby
+ 10.11.1.1
+ test
+
+
+
+
+
+
+ org.apache.openjpa
+ openjpa-maven-plugin
+ 3.1.0
+
+ org/apache/bval/**
+ **/model/*.class
+ true
+ true
+
+
+
+ org.apache.openjpa
+ openjpa
+
+ 3.1.0
+
+
+
+
+ enhancer
+
+ enhance
+
+ process-classes
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.jpa/src/main/feature/feature.xml b/bundles/org.openhab.persistence.jpa/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..7e5a879b11b87
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/feature/feature.xml
@@ -0,0 +1,11 @@
+
+
+ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features
+
+
+ openhab-runtime-base
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.jpa/${project.version}
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jpa
+
+
+
diff --git a/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaConfiguration.java b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaConfiguration.java
new file mode 100644
index 0000000000000..45760ef6983fe
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaConfiguration.java
@@ -0,0 +1,87 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Map;
+
+import org.apache.commons.lang.StringUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * The configuration required for Jpa binding.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ * @author Kai Kreuzer - migrated to 3.x
+ *
+ */
+public class JpaConfiguration {
+ private static final Logger logger = LoggerFactory.getLogger(JpaConfiguration.class);
+
+ private static final String CFG_CONNECTION_URL = "url";
+ private static final String CFG_DRIVER_CLASS = "driver";
+ private static final String CFG_USERNAME = "user";
+ private static final String CFG_PASSWORD = "password";
+ private static final String CFG_SYNCMAPPING = "syncmappings";
+
+ public static boolean isInitialized = false;
+
+ public final String dbConnectionUrl;
+ public final String dbDriverClass;
+ public final String dbUserName;
+ public final String dbPassword;
+ public final String dbSyncMapping;
+
+ public JpaConfiguration(final Map<String, Object> properties) {
+ logger.debug("Update config...");
+
+ String param = (String) properties.get(CFG_CONNECTION_URL);
+ logger.debug("url: {}", param);
+ if (param == null) {
+ logger.warn("Connection url is required in jpa.cfg!");
+ }
+ if (StringUtils.isBlank(param)) {
+ logger.warn("Empty connection url in jpa.cfg!");
+ }
+ dbConnectionUrl = param;
+
+ param = (String) properties.get(CFG_DRIVER_CLASS);
+ logger.debug("driver: {}", param);
+ if (param == null) {
+ logger.warn("Driver class is required in jpa.cfg!");
+ }
+ if (StringUtils.isBlank(param)) {
+ logger.warn("Empty driver class in jpa.cfg!");
+ }
+ dbDriverClass = param;
+
+ if (properties.get(CFG_USERNAME) == null) {
+ logger.info("{} was not specified!", CFG_USERNAME);
+ }
+ dbUserName = (String) properties.get(CFG_USERNAME);
+
+ if (properties.get(CFG_PASSWORD) == null) {
+ logger.info("{} was not specified!", CFG_PASSWORD);
+ }
+ dbPassword = (String) properties.get(CFG_PASSWORD);
+
+ if (properties.get(CFG_SYNCMAPPING) == null) {
+ logger.debug("{} was not specified!", CFG_SYNCMAPPING);
+ }
+ dbSyncMapping = (String) properties.get(CFG_SYNCMAPPING);
+
+ isInitialized = true;
+ logger.debug("Update config... done");
+ }
+
+}
diff --git a/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaHistoricItem.java b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaHistoricItem.java
new file mode 100644
index 0000000000000..b8080d8865f45
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaHistoricItem.java
@@ -0,0 +1,138 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.Calendar;
+import java.util.Date;
+import java.util.List;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.persistence.jpa.internal.model.JpaPersistentItem;
+
+/**
+ * The historic item as returned when querying the service.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+public class JpaHistoricItem implements HistoricItem {
+
+ final private String name;
+ final private State state;
+ final private Date timestamp;
+
+ public JpaHistoricItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+ /**
+ * This method maps a jpa result item to this historic item.
+ *
+ * @param jpaQueryResult the JPA query result containing the persisted items
+ * @param item used for query information, like the state (State)
+ * @return list of historic items
+ */
+ public static List<HistoricItem> fromResultList(List<JpaPersistentItem> jpaQueryResult, Item item) {
+ List<HistoricItem> ret = new ArrayList<>();
+ for (JpaPersistentItem i : jpaQueryResult) {
+ HistoricItem hi = fromPersistedItem(i, item);
+ ret.add(hi);
+ }
+ return ret;
+ }
+
+ /**
+ * Converts the string value of the persisted item to the state of a HistoricItem.
+ *
+ * @param pItem the persisted JpaPersistentItem
+ * @param item the source reference Item
+ * @return historic item
+ */
+ public static HistoricItem fromPersistedItem(JpaPersistentItem pItem, Item item) {
+ State state;
+ if (item instanceof NumberItem) {
+ state = new DecimalType(Double.valueOf(pItem.getValue()));
+ } else if (item instanceof DimmerItem) {
+ state = new PercentType(Integer.valueOf(pItem.getValue()));
+ } else if (item instanceof SwitchItem) {
+ state = OnOffType.valueOf(pItem.getValue());
+ } else if (item instanceof ContactItem) {
+ state = OpenClosedType.valueOf(pItem.getValue());
+ } else if (item instanceof RollershutterItem) {
+ state = PercentType.valueOf(pItem.getValue());
+ } else if (item instanceof DateTimeItem) {
+ Calendar cal = Calendar.getInstance();
+ cal.setTime(new Date(Long.valueOf(pItem.getValue())));
+ state = new DateTimeType(cal);
+ } else if (item instanceof LocationItem) {
+ PointType pType = null;
+ String[] comps = pItem.getValue().split(";");
+ if (comps.length >= 2) {
+ pType = new PointType(new DecimalType(comps[0]), new DecimalType(comps[1]));
+
+ if (comps.length == 3) {
+ pType.setAltitude(new DecimalType(comps[2]));
+ }
+ }
+ state = pType;
+
+ } else if (item instanceof StringListType) {
+ state = new StringListType(pItem.getValue());
+ } else {
+ state = new StringType(pItem.getValue());
+ }
+
+ return new JpaHistoricItem(item.getName(), state, pItem.getTimestamp());
+ }
+}
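The `LocationItem` branch in `fromPersistedItem` expects the persisted value as a `lat;lon[;alt]` string split on `;`, with the third component used as altitude. A minimal standalone sketch of that split (hypothetical class name and example coordinates):

```java
public class LocationSplitDemo {
    // Splits a persisted LocationItem value "lat;lon[;alt]" the same way
    // fromPersistedItem does before building a PointType from the components.
    static double[] parse(String value) {
        String[] comps = value.split(";");
        if (comps.length < 2) {
            throw new IllegalArgumentException("expected at least lat;lon");
        }
        double[] out = new double[comps.length];
        for (int i = 0; i < comps.length; i++) {
            out[i] = Double.parseDouble(comps[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        // hypothetical persisted value: latitude, longitude, altitude
        double[] p = parse("52.52;13.405;34.0");
        System.out.println(p[0] + ", " + p[1] + " alt=" + p[2]);
    }
}
```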
diff --git a/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaPersistenceService.java b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaPersistenceService.java
new file mode 100644
index 0000000000000..cd14af2d0d95d
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/JpaPersistenceService.java
@@ -0,0 +1,326 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Collections;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import javax.persistence.EntityManager;
+import javax.persistence.EntityManagerFactory;
+import javax.persistence.Persistence;
+import javax.persistence.Query;
+
+import org.apache.commons.lang.StringUtils;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.UnDefType;
+import org.openhab.persistence.jpa.internal.model.JpaPersistentItem;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * JPA based implementation of QueryablePersistenceService.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.jpa", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class JpaPersistenceService implements QueryablePersistenceService {
+ private static final Logger logger = LoggerFactory.getLogger(JpaPersistenceService.class);
+
+ @Reference
+ protected @NonNullByDefault({}) ItemRegistry itemRegistry;
+
+ private @Nullable EntityManagerFactory emf = null;
+
+ private @NonNullByDefault({}) JpaConfiguration config;
+
+ /**
+ * lazy loading because update() is called after activate()
+ *
+ * @return EntityManagerFactory
+ */
+ protected @Nullable EntityManagerFactory getEntityManagerFactory() {
+ if (emf == null) {
+ emf = newEntityManagerFactory();
+ }
+ return emf;
+ }
+
+ @Activate
+ public void activate(BundleContext context, Map<String, Object> properties) {
+ logger.debug("Activating jpa persistence service");
+ config = new JpaConfiguration(properties);
+ }
+
+ /**
+ * Closes the EntityPersistenceFactory
+ */
+ @Deactivate
+ public void deactivate() {
+ logger.debug("Deactivating jpa persistence service");
+ closeEntityManagerFactory();
+ }
+
+ @Override
+ public String getId() {
+ return "jpa";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "JPA";
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ logger.debug("Storing item: {}", item.getName());
+
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("This item is of undefined type. Cannot persist it!");
+ return;
+ }
+
+ if (!JpaConfiguration.isInitialized) {
+ logger.debug("Trying to create EntityManagerFactory but we don't have configuration yet!");
+ return;
+ }
+
+ // determine item name to be stored
+ String name = (alias != null) ? alias : item.getName();
+
+ JpaPersistentItem pItem = new JpaPersistentItem();
+ try {
+ String newValue = StateHelper.toString(item.getState());
+ pItem.setValue(newValue);
+ logger.debug("Stored new value: {}", newValue);
+ } catch (Exception e1) {
+ logger.error("Error on converting state value to string: {}", e1.getMessage());
+ return;
+ }
+ pItem.setName(name);
+ pItem.setRealName(item.getName());
+ pItem.setTimestamp(new Date());
+
+ EntityManager em = getEntityManagerFactory().createEntityManager();
+ try {
+ logger.debug("Persisting item...");
+ // In RESOURCE_LOCAL calls to EntityManager require a begin/commit
+ em.getTransaction().begin();
+ em.persist(pItem);
+ em.getTransaction().commit();
+ logger.debug("Persisting item...done");
+ } catch (Exception e) {
+ logger.error("Error on persisting item! Rolling back!", e);
+ em.getTransaction().rollback();
+ } finally {
+ em.close();
+ }
+
+ logger.debug("Storing item...done");
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("Querying for historic item: {}", filter.getItemName());
+
+ if (!JpaConfiguration.isInitialized) {
+ logger.warn("Trying to create EntityManagerFactory but we don't have configuration yet!");
+ return Collections.emptyList();
+ }
+
+ String itemName = filter.getItemName();
+ Item item = getItemFromRegistry(itemName);
+
+ String sortOrder;
+ if (filter.getOrdering() == Ordering.ASCENDING) {
+ sortOrder = "ASC";
+ } else {
+ sortOrder = "DESC";
+ }
+
+ boolean hasBeginDate = false;
+ boolean hasEndDate = false;
+ String queryString = "SELECT n FROM " + JpaPersistentItem.class.getSimpleName()
+ + " n WHERE n.realName = :itemName";
+ if (filter.getBeginDate() != null) {
+ queryString += " AND n.timestamp >= :beginDate";
+ hasBeginDate = true;
+ }
+ if (filter.getEndDate() != null) {
+ queryString += " AND n.timestamp <= :endDate";
+ hasEndDate = true;
+ }
+ queryString += " ORDER BY n.timestamp " + sortOrder;
+
+ logger.debug("The query: {}", queryString);
+
+ EntityManager em = getEntityManagerFactory().createEntityManager();
+ try {
+ // In RESOURCE_LOCAL calls to EntityManager require a begin/commit
+ em.getTransaction().begin();
+
+ logger.debug("Creating query...");
+ Query query = em.createQuery(queryString);
+ query.setParameter("itemName", item.getName());
+ if (hasBeginDate) {
+ query.setParameter("beginDate", filter.getBeginDate());
+ }
+ if (hasEndDate) {
+ query.setParameter("endDate", filter.getEndDate());
+ }
+
+ query.setFirstResult(filter.getPageNumber() * filter.getPageSize());
+ query.setMaxResults(filter.getPageSize());
+ logger.debug("Creating query...done");
+
+ logger.debug("Retrieving result list...");
+ @SuppressWarnings("unchecked")
+ List<JpaPersistentItem> result = query.getResultList();
+ logger.debug("Retrieving result list...done");
+
+ List<HistoricItem> historicList = JpaHistoricItem.fromResultList(result, item);
+ logger.debug("Converted {} items to HistoricItem", historicList.size());
+
+ em.getTransaction().commit();
+
+ return historicList;
+ } catch (Exception e) {
+ logger.error("Error on querying database!", e);
+ em.getTransaction().rollback();
+
+ } finally {
+ em.close();
+ }
+
+ return Collections.emptyList();
+ }
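The JPQL string assembled in `query()` depends only on which filter bounds are set and the sort order. The following standalone sketch (hypothetical class) reproduces that assembly so the resulting query text is easy to inspect:

```java
public class JpqlBuildDemo {
    // Mirrors the query-string assembly in JpaPersistenceService.query():
    // a fixed base clause plus optional begin/end bounds and an ORDER BY direction.
    static String build(boolean hasBeginDate, boolean hasEndDate, String sortOrder) {
        String queryString = "SELECT n FROM JpaPersistentItem n WHERE n.realName = :itemName";
        if (hasBeginDate) {
            queryString += " AND n.timestamp >= :beginDate";
        }
        if (hasEndDate) {
            queryString += " AND n.timestamp <= :endDate";
        }
        return queryString + " ORDER BY n.timestamp " + sortOrder;
    }

    public static void main(String[] args) {
        String q = build(true, true, "ASC");
        if (!q.equals("SELECT n FROM JpaPersistentItem n WHERE n.realName = :itemName"
                + " AND n.timestamp >= :beginDate AND n.timestamp <= :endDate"
                + " ORDER BY n.timestamp ASC")) {
            throw new AssertionError(q);
        }
        System.out.println(q);
    }
}
```

Because the date bounds are bound as named parameters rather than concatenated values, the real method is not exposed to injection through the filter dates; only the statically chosen `ASC`/`DESC` token is appended directly.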
+
+ /**
+ * Creates a new EntityManagerFactory with properties read from openhab.cfg via JpaConfiguration.
+ *
+ * @return initialized EntityManagerFactory
+ */
+ protected EntityManagerFactory newEntityManagerFactory() {
+ logger.trace("Creating EntityManagerFactory...");
+
+ Map<String, String> properties = new HashMap<>();
+ properties.put("javax.persistence.jdbc.url", config.dbConnectionUrl);
+ properties.put("javax.persistence.jdbc.driver", config.dbDriverClass);
+ if (config.dbUserName != null) {
+ properties.put("javax.persistence.jdbc.user", config.dbUserName);
+ }
+ if (config.dbPassword != null) {
+ properties.put("javax.persistence.jdbc.password", config.dbPassword);
+ }
+ if (config.dbUserName != null && config.dbPassword == null) {
+ logger.warn("JPA persistence - it is recommended to use a password to protect data store");
+ }
+ if (config.dbSyncMapping != null && !StringUtils.isBlank(config.dbSyncMapping)) {
+ logger.warn("You are setting openjpa.jdbc.SynchronizeMappings, I hope you know what you're doing!");
+ properties.put("openjpa.jdbc.SynchronizeMappings", config.dbSyncMapping);
+ }
+
+ EntityManagerFactory fac = Persistence.createEntityManagerFactory(getPersistenceUnitName(), properties);
+ logger.debug("Creating EntityManagerFactory...done");
+
+ return fac;
+ }
+
+ /**
+ * Closes EntityManagerFactory
+ */
+ protected void closeEntityManagerFactory() {
+ if (emf != null) {
+ emf.close();
+ emf = null;
+ }
+ logger.debug("Closing EntityManagerFactory...done");
+ }
+
+ /**
+ * Checks if EntityManagerFactory is open
+ *
+ * @return true when open, false otherwise
+ */
+ protected boolean isEntityManagerFactoryOpen() {
+ return emf != null && emf.isOpen();
+ }
+
+ /**
+ * Return the persistence unit as in persistence.xml file.
+ *
+ * @return the persistence unit name
+ */
+ protected String getPersistenceUnitName() {
+ return "default";
+ }
+
+ /**
+ * Retrieves the item for the given name from the item registry
+ *
+ * @param itemName the name of the item
+ * @return the item, or null if it could not be found
+ */
+ private @Nullable Item getItemFromRegistry(String itemName) {
+ Item item = null;
+ try {
+ if (itemRegistry != null) {
+ item = itemRegistry.getItem(itemName);
+ }
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item type for {}", itemName);
+ // Set type to null - data will be returned as StringType
+ item = null;
+ }
+ return item;
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+
+}
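The `newEntityManagerFactory()` method above builds the JDBC properties from `config.dbConnectionUrl`, `config.dbDriverClass`, `config.dbUserName`, `config.dbPassword` and `config.dbSyncMapping`. A minimal sketch of the corresponding service configuration, assuming the key names `url`, `driver`, `user`, `password` and `syncmapping` as used in the openHAB JPA documentation (shown here with an embedded Derby example; the values are illustrative only):

```ini
# services/jpa.cfg -- illustrative values only
url=jdbc:derby:./jpa;create=true
driver=org.apache.derby.jdbc.EmbeddedDriver
# optional credentials; a password is recommended for non-embedded databases
# user=openhab
# password=secret
# optional, passed through to openjpa.jdbc.SynchronizeMappings
# syncmapping=buildSchema(ForeignKeys=true)
```

For a different database, swap the `url` and `driver` values for the matching JDBC connection string and driver class.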
diff --git a/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/StateHelper.java b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/StateHelper.java
new file mode 100644
index 0000000000000..cb73dcb8f225a
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/StateHelper.java
@@ -0,0 +1,52 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Locale;
+
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.types.State;
+
+/**
+ * Helper class for dealing with State
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+public class StateHelper {
+
+ /**
+ * Converts the given State to a string that can be persisted in db
+ *
+ * @param state the state of the item to be persisted
+ * @return state converted as string
+ * @throws Exception
+ */
+ public static String toString(State state) throws Exception {
+ if (state instanceof DateTimeType) {
+ return String.valueOf(((DateTimeType) state).getCalendar().getTime().getTime());
+ }
+ if (state instanceof DecimalType) {
+ return String.valueOf(((DecimalType) state).doubleValue());
+ }
+ if (state instanceof PointType) {
+ PointType pType = (PointType) state;
+ return String.format(Locale.ENGLISH, "%f;%f;%f", pType.getLatitude().doubleValue(),
+ pType.getLongitude().doubleValue(), pType.getAltitude().doubleValue());
+ }
+
+ return state.toString();
+ }
+}
diff --git a/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/model/JpaPersistentItem.java b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/model/JpaPersistentItem.java
new file mode 100644
index 0000000000000..ddebbb62d983c
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/java/org/openhab/persistence/jpa/internal/model/JpaPersistentItem.java
@@ -0,0 +1,104 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal.model;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
+import javax.persistence.Id;
+import javax.persistence.Table;
+import javax.persistence.Temporal;
+import javax.persistence.TemporalType;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * This is the DAO object used for storing items to and retrieving them from the database.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+
+@Entity
+@Table(name = "HISTORIC_ITEM")
+public class JpaPersistentItem implements HistoricItem {
+
+ @Id
+ @GeneratedValue(strategy = GenerationType.AUTO)
+ private Long id;
+
+ private String name = "";
+ private String realName = "";
+ @Temporal(TemporalType.TIMESTAMP)
+ private Date timestamp = new Date();
+ @Column(length = 32672) // 32k, max varchar for apache derby
+ private String value = "";
+
+ public Long getId() {
+ return id;
+ }
+
+ public void setId(Long id) {
+ this.id = id;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public String getRealName() {
+ return realName;
+ }
+
+ public void setRealName(String realName) {
+ this.realName = realName;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ public void setTimestamp(Date timestamp) {
+ this.timestamp = timestamp;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ public void setValue(String value) {
+ this.value = value;
+ }
+
+ @Override
+ public State getState() {
+ return UnDefType.NULL;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(getTimestamp()) + ": " + getName() + " -> " + value;
+ }
+}
diff --git a/bundles/org.openhab.persistence.jpa/src/main/resources/META-INF/persistence.xml b/bundles/org.openhab.persistence.jpa/src/main/resources/META-INF/persistence.xml
new file mode 100644
index 0000000000000..98de7665ff71b
--- /dev/null
+++ b/bundles/org.openhab.persistence.jpa/src/main/resources/META-INF/persistence.xml
@@ -0,0 +1,34 @@
+
+
+
+
+ org.apache.openjpa.persistence.PersistenceProviderImpl
+
+ org.openhab.persistence.jpa.internal.model.JpaPersistentItem
+ true
+
+
+
+
+
+
+
+
+
+
+
+ org.apache.openjpa.persistence.PersistenceProviderImpl
+
+ org.openhab.persistence.jpa.internal.model.JpaPersistentItem
+ true
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.mapdb/.classpath b/bundles/org.openhab.persistence.mapdb/.classpath
new file mode 100644
index 0000000000000..19368e503c1f5
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/.classpath
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.mapdb/.project b/bundles/org.openhab.persistence.mapdb/.project
new file mode 100644
index 0000000000000..a9881555da8ea
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/.project
@@ -0,0 +1,23 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>org.openhab.persistence.mapdb</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.eclipse.jdt.core.javabuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.m2e.core.maven2Builder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.eclipse.jdt.core.javanature</nature>
+		<nature>org.eclipse.m2e.core.maven2Nature</nature>
+	</natures>
+</projectDescription>
diff --git a/bundles/org.openhab.persistence.mapdb/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.mapdb/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..29fe717b8794b
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.mapdb/NOTICE b/bundles/org.openhab.persistence.mapdb/NOTICE
new file mode 100644
index 0000000000000..38d625e349232
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/NOTICE
@@ -0,0 +1,13 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
diff --git a/bundles/org.openhab.persistence.mapdb/pom.xml b/bundles/org.openhab.persistence.mapdb/pom.xml
new file mode 100644
index 0000000000000..d4e70400d27e4
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/pom.xml
@@ -0,0 +1,15 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+	<modelVersion>4.0.0</modelVersion>
+
+	<parent>
+		<groupId>org.openhab.addons.bundles</groupId>
+		<artifactId>org.openhab.addons.reactor.bundles</artifactId>
+		<version>3.0.0-SNAPSHOT</version>
+	</parent>
+
+	<artifactId>org.openhab.persistence.mapdb</artifactId>
+
+	<name>openHAB Add-ons :: Bundles :: Persistence Service :: MapDB</name>
+</project>
diff --git a/bundles/org.openhab.persistence.mapdb/src/main/feature/feature.xml b/bundles/org.openhab.persistence.mapdb/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..90e8a7ab2a898
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/src/main/feature/feature.xml
@@ -0,0 +1,10 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.mapdb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+	<repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+	<feature name="openhab-persistence-mapdb" description="MapDB Persistence" version="${project.version}">
+		<feature>openhab-runtime-base</feature>
+		<bundle>mvn:org.openhab.addons.bundles/org.openhab.persistence.mapdb/${project.version}</bundle>
+	</feature>
+
+</features>
diff --git a/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbItem.java b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbItem.java
new file mode 100644
index 0000000000000..78fcfd6b8ef47
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbItem.java
@@ -0,0 +1,90 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+
+/**
+ * This is a Java bean used to persist item states with timestamps in the database.
+ *
+ * @author Jens Viebig - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class MapDbItem implements HistoricItem, PersistenceItemInfo {
+
+ private String name = "";
+
+ private State state = UnDefType.NULL;
+
+ private Date timestamp = new Date(0);
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ public void setState(State state) {
+ this.state = state;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ public void setTimestamp(Date timestamp) {
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+ @Override
+ public @Nullable Integer getCount() {
+ return null;
+ }
+
+ @Override
+ public @Nullable Date getEarliest() {
+ return null;
+ }
+
+ @Override
+ public @Nullable Date getLatest() {
+ return null;
+ }
+
+ public boolean isValid() {
+ return name != null && state != null && timestamp != null;
+ }
+}
diff --git a/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbPersistenceService.java b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbPersistenceService.java
new file mode 100644
index 0000000000000..b044eb66f198c
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/MapDbPersistenceService.java
@@ -0,0 +1,194 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.io.File;
+import java.util.Collections;
+import java.util.Date;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ExecutorService;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.mapdb.DB;
+import org.mapdb.DBMaker;
+import org.openhab.core.common.ThreadPoolManager;
+import org.openhab.core.config.core.ConfigConstants;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.service.component.annotations.Component;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ * This is the implementation of the MapDB {@link PersistenceService}. To learn
+ * more about MapDB please visit their website.
+ *
+ * @author Jens Viebig - Initial contribution
+ * @author Martin Kühl - Port to 3.x
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class, QueryablePersistenceService.class })
+public class MapDbPersistenceService implements QueryablePersistenceService {
+
+ private static final String SERVICE_ID = "mapdb";
+ private static final String SERVICE_LABEL = "MapDB";
+ private static final String DB_FOLDER_NAME = ConfigConstants.getUserDataFolder() + File.separator + "persistence"
+ + File.separator + "mapdb";
+ private static final String DB_FILE_NAME = "storage.mapdb";
+
+ private final Logger logger = LoggerFactory.getLogger(MapDbPersistenceService.class);
+
+ @NonNullByDefault({})
+ private ExecutorService threadPool;
+
+ /** holds the local instance of the MapDB database */
+ @NonNullByDefault({})
+ private DB db;
+ @NonNullByDefault({})
+ private Map<String, String> map;
+
+ private transient Gson mapper = new GsonBuilder().registerTypeHierarchyAdapter(State.class, new StateTypeAdapter())
+ .create();
+
+ public void activate() {
+ logger.debug("MapDB persistence service is being activated");
+
+ threadPool = ThreadPoolManager.getPool(getClass().getSimpleName());
+
+ File folder = new File(DB_FOLDER_NAME);
+ if (!folder.exists()) {
+ if (!folder.mkdirs()) {
+ logger.warn("Failed to create one or more directories in the path '{}'", DB_FOLDER_NAME);
+ logger.warn("MapDB persistence service activation has failed.");
+ return;
+ }
+ }
+
+ File dbFile = new File(DB_FOLDER_NAME, DB_FILE_NAME);
+ db = DBMaker.newFileDB(dbFile).closeOnJvmShutdown().make();
+ map = db.createTreeMap("itemStore").makeOrGet();
+ logger.debug("MapDB persistence service is now activated");
+ }
+
+ public void deactivate() {
+ logger.debug("MapDB persistence service deactivated");
+ if (db != null) {
+ db.close();
+ }
+ threadPool.shutdown();
+ }
+
+ @Override
+ public String getId() {
+ return SERVICE_ID;
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return SERVICE_LABEL;
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return map.values().stream().map(this::deserialize).flatMap(MapDbPersistenceService::streamOptional)
+ .collect(Collectors.<PersistenceItemInfo> toSet());
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, item.getName());
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ // PersistenceManager passes SimpleItemConfiguration.alias which can be null
+ if (alias == null) {
+ alias = item.getName();
+ }
+ logger.debug("store called for {}", alias);
+
+ State state = item.getState();
+ MapDbItem mItem = new MapDbItem();
+ mItem.setName(alias);
+ mItem.setState(state);
+ mItem.setTimestamp(new Date());
+ String json = serialize(mItem);
+ map.put(alias, json);
+ commit();
+ logger.debug("Stored '{}' with state '{}' in MapDB database", alias, state.toString());
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ String json = map.get(filter.getItemName());
+ if (json == null) {
+ return Collections.emptyList();
+ }
+ Optional<MapDbItem> item = deserialize(json);
+ if (!item.isPresent()) {
+ return Collections.emptyList();
+ }
+ return Collections.singletonList(item.get());
+ }
+
+ private String serialize(MapDbItem item) {
+ return mapper.toJson(item);
+ }
+
+ private Optional<MapDbItem> deserialize(String json) {
+ MapDbItem item = mapper.<MapDbItem> fromJson(json, MapDbItem.class);
+ if (item == null || !item.isValid()) {
+ logger.warn("Deserialized invalid item: {}", item);
+ return Optional.empty();
+ }
+ return Optional.of(item);
+ }
+
+ private void commit() {
+ threadPool.submit(() -> db.commit());
+ }
+
+ private static <T> Stream<T> streamOptional(Optional<T> opt) {
+ if (!opt.isPresent()) {
+ return Stream.empty();
+ }
+ return Stream.of(opt.get());
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+}
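Since `getDefaultStrategies()` returns `RESTORE` and `CHANGE`, the service persists every item change and restores item states on startup even without an explicit configuration. A `persistence/mapdb.persist` file can override this; a minimal sketch in the standard openHAB persistence DSL:

```
// persistence/mapdb.persist -- illustrative
Strategies {
    default = everyChange
}

Items {
    // persist every change of every item and restore states on startup
    * : strategy = everyChange, restoreOnStartup
}
```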
diff --git a/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/StateTypeAdapter.java b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/StateTypeAdapter.java
new file mode 100644
index 0000000000000..c650fcf50525b
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/src/main/java/org/openhab/persistence/mapdb/internal/StateTypeAdapter.java
@@ -0,0 +1,70 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.io.IOException;
+import java.util.Collections;
+import java.util.List;
+
+import org.openhab.core.types.State;
+import org.openhab.core.types.TypeParser;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.TypeAdapter;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonToken;
+import com.google.gson.stream.JsonWriter;
+
+/**
+ * A GSON TypeAdapter for openHAB State values.
+ *
+ * @author Martin Kühl - Initial contribution
+ */
+public class StateTypeAdapter extends TypeAdapter<State> {
+ private static final String TYPE_SEPARATOR = "@@@";
+
+ private final Logger logger = LoggerFactory.getLogger(StateTypeAdapter.class);
+
+ @Override
+ public State read(JsonReader reader) throws IOException {
+ if (reader.peek() == JsonToken.NULL) {
+ reader.nextNull();
+ return null;
+ }
+ String value = reader.nextString();
+ String[] parts = value.split(TYPE_SEPARATOR);
+ String valueTypeName = parts[0];
+ String valueAsString = parts[1];
+
+ try {
+ @SuppressWarnings("unchecked")
+ Class<? extends State> valueType = (Class<? extends State>) Class.forName(valueTypeName);
+ List<Class<? extends State>> types = Collections.singletonList(valueType);
+ return TypeParser.parseState(types, valueAsString);
+ } catch (Exception e) {
+ logger.warn("Couldn't deserialize state '{}': {}", value, e.getMessage());
+ }
+ return null;
+ }
+
+ @Override
+ public void write(JsonWriter writer, State state) throws IOException {
+ if (state == null) {
+ writer.nullValue();
+ return;
+ }
+ String value = state.getClass().getName() + TYPE_SEPARATOR + state.toFullString();
+ writer.value(value);
+ }
+}
diff --git a/bundles/org.openhab.persistence.mapdb/src/test/java/org/openhab/persistence/mapdb/StateTypeAdapterTest.java b/bundles/org.openhab.persistence.mapdb/src/test/java/org/openhab/persistence/mapdb/StateTypeAdapterTest.java
new file mode 100644
index 0000000000000..11eb9884e5d4c
--- /dev/null
+++ b/bundles/org.openhab.persistence.mapdb/src/test/java/org/openhab/persistence/mapdb/StateTypeAdapterTest.java
@@ -0,0 +1,49 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb;
+
+import static org.hamcrest.CoreMatchers.*;
+import static org.junit.Assert.*;
+
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.types.State;
+import org.junit.Test;
+import org.openhab.persistence.mapdb.internal.StateTypeAdapter;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ *
+ * @author Martin Kühl - Initial contribution
+ */
+public class StateTypeAdapterTest {
+ Gson mapper = new GsonBuilder()
+ .registerTypeHierarchyAdapter(State.class, new StateTypeAdapter())
+ .create();
+
+ @Test
+ public void readWriteRoundtripShouldRecreateTheWrittenState() {
+ assertThat(roundtrip(OnOffType.ON), is(equalTo(OnOffType.ON)));
+ assertThat(roundtrip(PercentType.HUNDRED), is(equalTo(PercentType.HUNDRED)));
+ assertThat(roundtrip(HSBType.GREEN), is(equalTo(HSBType.GREEN)));
+ assertThat(roundtrip(StringType.valueOf("test")), is(equalTo(StringType.valueOf("test"))));
+ }
+
+ private State roundtrip(State state) {
+ return mapper.fromJson(mapper.toJson(state), State.class);
+ }
+}
diff --git a/bundles/org.openhab.persistence.mongodb/.classpath b/bundles/org.openhab.persistence.mongodb/.classpath
new file mode 100644
index 0000000000000..d6a726fe71017
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.classpath
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.mongodb/.project b/bundles/org.openhab.persistence.mongodb/.project
new file mode 100644
index 0000000000000..2715fbf38207f
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.project
@@ -0,0 +1,23 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>org.openhab.persistence.mongodb</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.eclipse.jdt.core.javabuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.m2e.core.maven2Builder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.eclipse.jdt.core.javanature</nature>
+		<nature>org.eclipse.m2e.core.maven2Nature</nature>
+	</natures>
+</projectDescription>
diff --git a/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..e9441bb123ec3
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding/<project>=UTF-8
diff --git a/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..29fe717b8794b
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.ui.prefs b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.ui.prefs
new file mode 100644
index 0000000000000..fe89f28bca590
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.jdt.ui.prefs
@@ -0,0 +1,2 @@
+eclipse.preferences.version=1
+formatter_settings_version=12
diff --git a/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..f29e940a0059c
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.mongodb/NOTICE b/bundles/org.openhab.persistence.mongodb/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
+
diff --git a/bundles/org.openhab.persistence.mongodb/README.md b/bundles/org.openhab.persistence.mongodb/README.md
new file mode 100644
index 0000000000000..479f668c2b895
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/README.md
@@ -0,0 +1,15 @@
+# MongoDB Persistence
+
+This service allows you to persist state updates using the MongoDB database. It supports writing information to a MongoDB document store, as well as querying from it.
+
+## Configuration
+
+This service can be configured in the file `services/mongodb.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| url | | Yes | Connection URL used to address MongoDB, e.g. `mongodb://localhost:27017` |
+| database | | Yes | Database name |
+| collection | | Yes | Collection name |
+
+All item and event related configuration is done in the file `persistence/mongodb.persist`.
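The three required properties from the table above go into `services/mongodb.cfg`; a minimal sketch with placeholder values:

```ini
# services/mongodb.cfg -- illustrative values only
url=mongodb://localhost:27017
database=openhab
collection=openhab
```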
diff --git a/bundles/org.openhab.persistence.mongodb/pom.xml b/bundles/org.openhab.persistence.mongodb/pom.xml
new file mode 100644
index 0000000000000..1e1d555ebab5a
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/pom.xml
@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+	<modelVersion>4.0.0</modelVersion>
+
+	<parent>
+		<groupId>org.openhab.addons.bundles</groupId>
+		<artifactId>org.openhab.addons.reactor.bundles</artifactId>
+		<version>3.0.0-SNAPSHOT</version>
+	</parent>
+
+	<artifactId>org.openhab.persistence.mongodb</artifactId>
+
+	<name>openHAB Add-ons :: Bundles :: Persistence Service :: MongoDB</name>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.mongodb</groupId>
+			<artifactId>mongo-java-driver</artifactId>
+			<version>2.13.1</version>
+		</dependency>
+	</dependencies>
+</project>
diff --git a/bundles/org.openhab.persistence.mongodb/src/main/feature/feature.xml b/bundles/org.openhab.persistence.mongodb/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..e00992cb6ae2a
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/src/main/feature/feature.xml
@@ -0,0 +1,10 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.mongodb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+	<repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+	<feature name="openhab-persistence-mongodb" description="MongoDB Persistence" version="${project.version}">
+		<feature>openhab-runtime-base</feature>
+		<bundle>mvn:org.openhab.addons.bundles/org.openhab.persistence.mongodb/${project.version}</bundle>
+	</feature>
+
+</features>
diff --git a/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBItem.java b/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBItem.java
new file mode 100644
index 0000000000000..c6388a949f024
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBItem.java
@@ -0,0 +1,60 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mongodb.internal;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is the implementation of the MongoDB historic item.
+ *
+ * @author Thorsten Hoeger - Initial contribution
+ */
+@NonNullByDefault
+public class MongoDBItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final Date timestamp;
+
+ public MongoDBItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+}
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBPersistenceService.java b/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBPersistenceService.java
new file mode 100644
index 0000000000000..1a5ba650c7af8
--- /dev/null
+++ b/bundles/org.openhab.persistence.mongodb/src/main/java/org/openhab/persistence/mongodb/internal/MongoDBPersistenceService.java
@@ -0,0 +1,355 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mongodb.internal;
+
+import java.util.ArrayList;
+import java.util.Calendar;
+import java.util.Collections;
+import java.util.Date;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.apache.commons.lang.StringUtils;
+import org.bson.types.ObjectId;
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.mongodb.BasicDBObject;
+import com.mongodb.DBCollection;
+import com.mongodb.DBCursor;
+import com.mongodb.DBObject;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientURI;
+
+/**
+ * This is the implementation of the MongoDB {@link PersistenceService}.
+ *
+ * @author Thorsten Hoeger - Initial contribution
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.mongodb", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class MongoDBPersistenceService implements QueryablePersistenceService {
+
+ private static final String FIELD_ID = "_id";
+ private static final String FIELD_ITEM = "item";
+ private static final String FIELD_REALNAME = "realName";
+ private static final String FIELD_TIMESTAMP = "timestamp";
+ private static final String FIELD_VALUE = "value";
+
+ private static final Logger logger = LoggerFactory.getLogger(MongoDBPersistenceService.class);
+
+ private @NonNullByDefault({}) String url;
+ private @NonNullByDefault({}) String db;
+ private @NonNullByDefault({}) String collection;
+
+ private boolean initialized = false;
+
+ @Reference
+ protected @NonNullByDefault({}) ItemRegistry itemRegistry;
+
+ private @NonNullByDefault({}) MongoClient cl;
+ private @NonNullByDefault({}) DBCollection mongoCollection;
+
+ @Activate
+ public void activate(final BundleContext bundleContext, final Map<String, Object> config) {
+ url = (String) config.get("url");
+ logger.debug("MongoDB URL {}", url);
+ if (StringUtils.isBlank(url)) {
+ logger.warn("The MongoDB database URL is missing - please configure the mongodb:url parameter.");
+ return;
+ }
+ db = (String) config.get("database");
+ logger.debug("MongoDB database {}", db);
+ if (StringUtils.isBlank(db)) {
+ logger.warn("The MongoDB database name is missing - please configure the mongodb:database parameter.");
+ return;
+ }
+ collection = (String) config.get("collection");
+ logger.debug("MongoDB collection {}", collection);
+ if (StringUtils.isBlank(collection)) {
+ logger.warn(
+ "The MongoDB database collection is missing - please configure the mongodb:collection parameter.");
+ return;
+ }
+
+ disconnectFromDatabase();
+ connectToDatabase();
+
+ // connection has been established... initialization completed!
+ initialized = true;
+ }
+
+ @Deactivate
+ public void deactivate(final int reason) {
+ logger.debug("MongoDB persistence bundle stopping. Disconnecting from database.");
+ disconnectFromDatabase();
+ }
+
+ @Override
+ public String getId() {
+ return "mongodb";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "MongoDB";
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ // Don't log undefined/uninitialised data
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ // If we've not initialised the bundle, then return
+ if (!initialized) {
+ logger.warn("MongoDB not initialized");
+ return;
+ }
+
+ // Connect to mongodb server if we're not already connected
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ // If we still didn't manage to connect, then return!
+ if (!isConnected()) {
+ logger.warn(
+ "mongodb: No connection to database. Cannot persist item '{}'! Will retry connecting to database next time.",
+ item);
+ return;
+ }
+
+ String realName = item.getName();
+ String name = (alias != null) ? alias : realName;
+ Object value = this.convertValue(item.getState());
+
+ DBObject obj = new BasicDBObject();
+ obj.put(FIELD_ID, new ObjectId());
+ obj.put(FIELD_ITEM, name);
+ obj.put(FIELD_REALNAME, realName);
+ obj.put(FIELD_TIMESTAMP, new Date());
+ obj.put(FIELD_VALUE, value);
+ this.mongoCollection.save(obj);
+
+ logger.debug("MongoDB save {}={}", name, value);
+ }
+
+ private Object convertValue(State state) {
+ Object value;
+ if (state instanceof PercentType) {
+ value = ((PercentType) state).toBigDecimal().doubleValue();
+ } else if (state instanceof DateTimeType) {
+ value = ((DateTimeType) state).getCalendar().getTime();
+ } else if (state instanceof DecimalType) {
+ value = ((DecimalType) state).toBigDecimal().doubleValue();
+ } else {
+ value = state.toString();
+ }
+ return value;
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public @NonNull Set<@NonNull PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ /**
+ * Checks if we have a database connection
+ *
+ * @return true if connection has been established, false otherwise
+ */
+ private boolean isConnected() {
+ return cl != null;
+ }
+
+ /**
+ * Connects to the database
+ */
+ private void connectToDatabase() {
+ try {
+ logger.debug("Connect MongoDB");
+ this.cl = new MongoClient(new MongoClientURI(this.url));
+ mongoCollection = cl.getDB(this.db).getCollection(this.collection);
+
+ BasicDBObject idx = new BasicDBObject();
+ idx.append(FIELD_TIMESTAMP, 1).append(FIELD_ITEM, 1);
+ this.mongoCollection.createIndex(idx);
+ logger.debug("Connect MongoDB ... done");
+ } catch (Exception e) {
+ logger.error("Failed to connect to database {}", this.url);
+ throw new RuntimeException("Cannot connect to database", e);
+ }
+ }
+
+ /**
+ * Disconnects from the database
+ */
+ private void disconnectFromDatabase() {
+ this.mongoCollection = null;
+ if (this.cl != null) {
+ this.cl.close();
+ }
+ cl = null;
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ if (!initialized) {
+ return Collections.emptyList();
+ }
+
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ if (!isConnected()) {
+ return Collections.emptyList();
+ }
+
+ String name = filter.getItemName();
+ Item item = getItem(name);
+
+ List<HistoricItem> items = new ArrayList<>();
+ DBObject query = new BasicDBObject();
+ if (filter.getItemName() != null) {
+ query.put(FIELD_ITEM, filter.getItemName());
+ }
+ if (filter.getState() != null && filter.getOperator() != null) {
+ String op = convertOperator(filter.getOperator());
+ Object value = convertValue(filter.getState());
+ query.put(FIELD_VALUE, new BasicDBObject(op, value));
+ }
+ // Combine begin and end date into a single timestamp criterion; calling put(FIELD_TIMESTAMP, ...)
+ // twice would overwrite the first condition when both dates are set
+ BasicDBObject dateQueries = new BasicDBObject();
+ if (filter.getBeginDate() != null) {
+ dateQueries.put("$gte", filter.getBeginDate());
+ }
+ if (filter.getEndDate() != null) {
+ dateQueries.put("$lte", filter.getEndDate());
+ }
+ if (!dateQueries.isEmpty()) {
+ query.put(FIELD_TIMESTAMP, dateQueries);
+ }
+
+ Integer sortDir = (filter.getOrdering() == Ordering.ASCENDING) ? 1 : -1;
+ DBCursor cursor = this.mongoCollection.find(query).sort(new BasicDBObject(FIELD_TIMESTAMP, sortDir))
+ .skip(filter.getPageNumber() * filter.getPageSize()).limit(filter.getPageSize());
+
+ while (cursor.hasNext()) {
+ BasicDBObject obj = (BasicDBObject) cursor.next();
+
+ final State state;
+ if (item instanceof NumberItem) {
+ state = new DecimalType(obj.getDouble(FIELD_VALUE));
+ } else if (item instanceof DimmerItem) {
+ state = new PercentType(obj.getInt(FIELD_VALUE));
+ } else if (item instanceof SwitchItem) {
+ state = OnOffType.valueOf(obj.getString(FIELD_VALUE));
+ } else if (item instanceof ContactItem) {
+ state = OpenClosedType.valueOf(obj.getString(FIELD_VALUE));
+ } else if (item instanceof RollershutterItem) {
+ state = new PercentType(obj.getInt(FIELD_VALUE));
+ } else if (item instanceof DateTimeItem) {
+ Calendar cal = Calendar.getInstance();
+ cal.setTime(obj.getDate(FIELD_VALUE));
+ state = new DateTimeType(cal);
+ } else {
+ state = new StringType(obj.getString(FIELD_VALUE));
+ }
+
+ items.add(new MongoDBItem(name, state, obj.getDate(FIELD_TIMESTAMP)));
+ }
+
+ return items;
+ }
+
+ private @Nullable String convertOperator(Operator operator) {
+ switch (operator) {
+ case EQ:
+ return "$eq";
+ case GT:
+ return "$gt";
+ case GTE:
+ return "$gte";
+ case LT:
+ return "$lt";
+ case LTE:
+ return "$lte";
+ case NEQ:
+ return "$ne";
+ default:
+ return null;
+ }
+ }
+
+ private @Nullable Item getItem(String itemName) {
+ Item item = null;
+ try {
+ if (itemRegistry != null) {
+ item = itemRegistry.getItem(itemName);
+ }
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item type for {}", itemName);
+ // Set type to null - data will be returned as StringType
+ item = null;
+ }
+ return item;
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+}
diff --git a/bundles/org.openhab.persistence.mysql/.classpath b/bundles/org.openhab.persistence.mysql/.classpath
new file mode 100644
index 0000000000000..0f47c9eacb373
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.classpath
@@ -0,0 +1,28 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.mysql/.project b/bundles/org.openhab.persistence.mysql/.project
new file mode 100644
index 0000000000000..6cbc901270ff7
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.project
@@ -0,0 +1,23 @@
+
+
+ org.openhab.persistence.mysql
+
+
+
+
+
+ org.eclipse.jdt.core.javabuilder
+
+
+
+
+ org.eclipse.m2e.core.maven2Builder
+
+
+
+
+
+ org.eclipse.jdt.core.javanature
+ org.eclipse.m2e.core.maven2Nature
+
+
diff --git a/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..e9441bb123ec3
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding/=UTF-8
diff --git a/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..e8c450c015943
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=enabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.ui.prefs b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.ui.prefs
new file mode 100644
index 0000000000000..fe89f28bca590
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.jdt.ui.prefs
@@ -0,0 +1,2 @@
+eclipse.preferences.version=1
+formatter_settings_version=12
diff --git a/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..f29e940a0059c
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.mysql/NOTICE b/bundles/org.openhab.persistence.mysql/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-core
+
diff --git a/bundles/org.openhab.persistence.mysql/README.md b/bundles/org.openhab.persistence.mysql/README.md
new file mode 100644
index 0000000000000..98a57e9df841f
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/README.md
@@ -0,0 +1,23 @@
+# MySQL Persistence
+
+This service allows you to persist and query item states using the [MySQL](https://www.mysql.com/) database. Note that other SQL databases require their own persistence services because of incompatibilities between different SQL dialects.
+
+The service creates a mapping table called `Items` that links each item to its own table, and a separate table is generated per item. Each item table stores the timestamp and the value; the value's SQL type depends on the item type, so the item state can be restored into openHAB exactly as it was stored.
+
+## Configuration
+
+This service can be configured in the file `services/mysql.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| url | | Yes | database URL, in the format `jdbc:mysql://<host>:<port>/<database>`. For example, `jdbc:mysql://127.0.0.1/openhab` |
+| user | | if needed | database user |
+| password | | if needed | database password |
+| reconnectCnt | | No | reconnection counter. Setting this to 1 will cause the service to close the connection and reconnect if there are any errors. |
+| waitTimeout | | No | connection timeout (in seconds). This sets the number of seconds that MySQL keeps a session open without any transactions. MySQL defaults to 8 hours, but some installations use lower values (possibly as low as 60 seconds), which would cause unnecessary reconnections. Set this higher than the maximum logging period. |
+| sqltype.string | | No | mapping of an openHAB item type to an SQL data type. See [this issue](https://github.com/openhab/openhab1-addons/issues/710) for more information. |
+| localtime | `false` | No | use MySQL server time to store item values (if set to `false`) or use openHAB server time (if set to `true`). For new installations, setting this to `true` is recommended. |
+
+All item and event related configuration is done in the file `persistence/mysql.persist`.
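+
+As a minimal sketch (host, credentials and database name are illustrative, not defaults), `services/mysql.cfg` could contain:
+
+```
+url=jdbc:mysql://127.0.0.1/openhab
+user=openhab
+password=secret
+reconnectCnt=1
+waitTimeout=28800
+```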
diff --git a/bundles/org.openhab.persistence.mysql/pom.xml b/bundles/org.openhab.persistence.mysql/pom.xml
new file mode 100644
index 0000000000000..e097408a4574c
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/pom.xml
@@ -0,0 +1,29 @@
+
+
+
+ 4.0.0
+
+
+ org.openhab.addons.bundles
+ org.openhab.addons.reactor.bundles
+ 3.0.0-SNAPSHOT
+
+
+ org.openhab.persistence.mysql
+
+ openHAB Add-ons :: Bundles :: Persistence Service :: MySQL
+
+
+ !com.mongodb.*,!io.netty.*,!com.bea.*,!io.reactivex.*,!org.reactivestreams.*,!de.erichseifert.*,!org.w3c.*,!org.jvnet.*,!com.ctc.*,!com.sun.*,!com.sleepycat.*,!dagger.*,!org.codehaus.*,!org.glassfish.*,!com.ibm.*,!javax.xml.*,!net.sf.*,!nu.xom.*,!org.bson.*,!org.dom4j.*,!org.jdom.*,!org.jdom2.*,!org.kxml2.io.*,!org.xmlpull.*,!sun.*,!com.mchange.*,!org.jboss.*,!com.google.protobuf
+
+
+
+
+
+ mysql
+ mysql-connector-java
+ 8.0.13
+
+
+
diff --git a/bundles/org.openhab.persistence.mysql/src/main/feature/feature.xml b/bundles/org.openhab.persistence.mysql/src/main/feature/feature.xml
new file mode 100644
index 0000000000000..9a21f0d59d72f
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/src/main/feature/feature.xml
@@ -0,0 +1,11 @@
+
+
+ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features
+
+
+ openhab-runtime-base
+ mvn:org.openhab.addons.bundles/org.openhab.persistence.mysql/${project.version}
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/mysql
+
+
+
diff --git a/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlItem.java b/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlItem.java
new file mode 100644
index 0000000000000..128f122604606
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlItem.java
@@ -0,0 +1,60 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mysql.internal;
+
+import java.text.DateFormat;
+import java.util.Date;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is a Java bean used to return historic items from a SQL database.
+ *
+ * @author Chris Jackson - Initial contribution
+ */
+public class MysqlItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final Date timestamp;
+
+ public MysqlItem(String name, State state, Date timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public Date getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+}
\ No newline at end of file
diff --git a/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlPersistenceService.java b/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlPersistenceService.java
new file mode 100644
index 0000000000000..3f858bfb893db
--- /dev/null
+++ b/bundles/org.openhab.persistence.mysql/src/main/java/org/openhab/persistence/mysql/internal/MysqlPersistenceService.java
@@ -0,0 +1,716 @@
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mysql.internal;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.sql.Timestamp;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Calendar;
+import java.util.Collections;
+import java.util.Formatter;
+import java.util.HashMap;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+import org.apache.commons.lang.StringUtils;
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This is the implementation of the mySQL {@link PersistenceService}.
+ *
+ * Data is persisted with the following conversions:
+ *
+ * Item-Type Data-Type MySQL-Type
+ * ========= ========= ==========
+ * ColorItem HSBType CHAR(25)
+ * ContactItem OpenClosedType CHAR(6)
+ * DateTimeItem DateTimeType DATETIME
+ * DimmerItem PercentType TINYINT
+ * NumberItem DecimalType DOUBLE
+ * RollershutterItem PercentType TINYINT
+ * StringItem StringType VARCHAR(20000)
+ * SwitchItem OnOffType CHAR(3)
+ *
+ * In the store method, type conversion is performed where the default type for
+ * an item is not as above. For example, DimmerType can return OnOffType, so to
+ * keep the best resolution, we store as a number in SQL and convert to
+ * DecimalType before persisting to MySQL.
+ *
+ * @author Henrik Sjöstrand - Initial contribution
+ * @author Thomas.Eichstaedt-Engelen - Enhancements
+ * @author Chris Jackson - Enhancements
+ * @author Helmut Lehmeyer - Enhancements
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.mysql", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class MysqlPersistenceService implements QueryablePersistenceService {
+
+ private static final Pattern EXTRACT_CONFIG_PATTERN = Pattern.compile("^(.*?)\\.([0-9.a-zA-Z]+)$");
+
+ private static final Logger logger = LoggerFactory.getLogger(MysqlPersistenceService.class);
+
+ private String driverClass = "com.mysql.jdbc.Driver";
+ private @NonNullByDefault({}) String url;
+ private @NonNullByDefault({}) String user;
+ private @NonNullByDefault({}) String password;
+
+ private boolean initialized = false;
+
+ @Reference
+ protected @NonNullByDefault({}) ItemRegistry itemRegistry;
+
+ // Error counter - used to reconnect to database on error
+ private int errCnt;
+ private int errReconnectThreshold = 0;
+
+ private int waitTimeout = -1;
+
+ // Time used for persisting items, False: MySQL Server time (default), True: openHAB Server time
+ private boolean localtime = false;
+
+ private @NonNullByDefault({}) Connection connection = null;
+
+ private Map<String, String> sqlTables = new HashMap<>();
+ private Map<String, String> sqlTypes = new HashMap<>();
+
+ /**
+ * Initialise the type array
+ * If other Types like DOUBLE or INT needed for serialisation it can be set in openhab.cfg
+ */
+ @Activate
+ public void activate(final BundleContext bundleContext, final Map<String, Object> config) {
+ sqlTypes.put("CALLITEM", "VARCHAR(200)");
+ sqlTypes.put("COLORITEM", "VARCHAR(70)");
+ sqlTypes.put("CONTACTITEM", "VARCHAR(6)");
+ sqlTypes.put("DATETIMEITEM", "DATETIME");
+ sqlTypes.put("DIMMERITEM", "TINYINT");
+ sqlTypes.put("LOCATIONITEM", "VARCHAR(30)");
+ sqlTypes.put("NUMBERITEM", "DOUBLE");
+ sqlTypes.put("ROLLERSHUTTERITEM", "TINYINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR(20000)");
+ sqlTypes.put("SWITCHITEM", "CHAR(3)");
+
+ Iterator<String> keys = config.keySet().iterator();
+ while (keys.hasNext()) {
+ String key = keys.next();
+
+ Matcher matcher = EXTRACT_CONFIG_PATTERN.matcher(key);
+
+ if (!matcher.matches()) {
+ continue;
+ }
+
+ matcher.reset();
+ matcher.find();
+
+ if (!matcher.group(1).equals("sqltype")) {
+ continue;
+ }
+
+ String itemType = matcher.group(2).toUpperCase() + "ITEM";
+ String value = (String) config.get(key);
+
+ sqlTypes.put(itemType, value);
+ }
+
+ disconnectFromDatabase();
+
+ url = (String) config.get("url");
+ if (StringUtils.isBlank(url)) {
+ logger.warn("The mySQL database URL is missing. Please configure the url parameter in the configuration.");
+ return;
+ }
+
+ user = (String) config.get("user");
+ if (StringUtils.isBlank(user)) {
+ logger.warn("The mySQL user is missing. Please configure the user parameter in the configuration.");
+ return;
+ }
+
+ password = (String) config.get("password");
+ if (StringUtils.isBlank(password)) {
+ logger.warn(
+ "The mySQL password is missing; attempting to connect without password. To specify a password, configure the password parameter in the configuration.");
+ }
+
+ String tmpString = (String) config.get("reconnectCnt");
+ if (StringUtils.isNotBlank(tmpString)) {
+ errReconnectThreshold = Integer.parseInt(tmpString);
+ }
+
+ tmpString = (String) config.get("waitTimeout");
+ if (StringUtils.isNotBlank(tmpString)) {
+ waitTimeout = Integer.parseInt(tmpString);
+ }
+
+ tmpString = (String) config.get("localtime");
+ if (StringUtils.isNotBlank(tmpString)) {
+ localtime = Boolean.parseBoolean(tmpString);
+ }
+
+ // reconnect to the database in case the configuration has changed.
+ connectToDatabase();
+
+ // connection has been established ... initialization completed!
+ initialized = true;
+
+ logger.debug("mySQL configuration complete.");
+ }
+
+ @Deactivate
+ public void deactivate(final int reason) {
+ logger.debug("mySQL persistence bundle stopping. Disconnecting from database.");
+ disconnectFromDatabase();
+ }
+
+ @Override
+ public String getId() {
+ return "mysql";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "MySQL";
+ }
+
+ /**
+ * Returns the SQL data type used to persist the given item.
+ *
+ * @param i the item to determine the type for
+ * @return the configured SQL type for the item, or the STRINGITEM fallback type
+ */
+ private @Nullable String getItemType(Item i) {
+ Item item = i;
+ if (i instanceof GroupItem) {
+ item = ((GroupItem) i).getBaseItem();
+ if (item == null) {
+ // The GroupItem has no base type defined in the *.items file; fall back to the first member
+ logger.debug(
+ "mySQL: Cannot detect ItemType for {} because the GroupItem's base type isn't set in the *.items file.",
+ i.getName());
+ item = ((GroupItem) i).getMembers().iterator().next();
+ }
+ }
+ String itemType = item.getClass().getSimpleName().toUpperCase();
+ if (sqlTypes.get(itemType) == null) {
+ logger.debug("mySQL: No sqlType found for ItemType {}, use ItemType STRINGITEM ({}) as Fallback for {}",
+ itemType, sqlTypes.get("STRINGITEM"), i.getName());
+ return sqlTypes.get("STRINGITEM");
+ }
+
+ logger.debug("mySQL: Use ItemType {} ({}) for Item {}", itemType, sqlTypes.get(itemType), i.getName());
+ return sqlTypes.get(itemType);
+ }
+
+ private @Nullable String getTable(Item item) {
+ String sqlCmd = null;
+ int rowId = 0;
+
+ String itemName = item.getName();
+ String tableName = sqlTables.get(itemName);
+
+ // Table already exists - return the name
+ if (tableName != null) {
+ return tableName;
+ }
+
+ logger.debug("mySQL: no Table found for itemName={} get:{}", itemName, sqlTables.get(itemName));
+
+ sqlCmd = "INSERT INTO Items (ItemName) VALUES (?)";
+
+ // Create a new entry in the Items table. This is the translation of
+ // item name to table
+ try (PreparedStatement statement = connection.prepareStatement(sqlCmd, Statement.RETURN_GENERATED_KEYS)) {
+ statement.setString(1, itemName);
+ statement.executeUpdate();
+
+ ResultSet resultSet = statement.getGeneratedKeys();
+ if (resultSet != null && resultSet.next()) {
+ rowId = resultSet.getInt(1);
+ }
+
+ if (rowId == 0) {
+ throw new SQLException("mySQL: Creating table for item '" + itemName + "' failed.");
+ }
+
+ // Create the table name
+ tableName = "Item" + rowId;
+ logger.debug("mySQL: new item {} is Item{}", itemName, rowId);
+ } catch (SQLException e) {
+ errCnt++;
+ logger.error("mySQL: Could not create entry for '{}' in table 'Items' with statement '{}': {}", itemName,
+ sqlCmd, e.getMessage());
+ }
+
+ // An error occurred adding the item name into the index list!
+ if (tableName == null) {
+ logger.error("mySQL: tableName was null");
+ return null;
+ }
+
+ String mysqlType = getItemType(item);
+
+ // We have a rowId, create the table for the data
+ sqlCmd = "CREATE TABLE " + tableName + " (Time DATETIME, Value " + mysqlType + ", PRIMARY KEY(Time));";
+ logger.debug("mySQL: query: {}", sqlCmd);
+
+ try (PreparedStatement statement = connection.prepareStatement(sqlCmd)) {
+ statement.executeUpdate();
+
+ logger.debug("mySQL: Table created for item '{}' with datatype {} in SQL database.", itemName, mysqlType);
+ sqlTables.put(itemName, tableName);
+ } catch (Exception e) {
+ errCnt++;
+ logger.error("mySQL: Could not create table for item '{}' with statement '{}': {}", itemName, sqlCmd,
+ e.getMessage());
+ }
+
+ // Check if the new entry is in the table list
+ // If it's not in the list, then there was an error and we need to do some tidying up
+ // The item needs to be removed from the index table to avoid duplicates
+ if (sqlTables.get(itemName) == null) {
+ logger.error("mySQL: Item '{}' was not added to the table - removing index", itemName);
+ sqlCmd = "DELETE FROM Items WHERE ItemName=?";
+ logger.debug("mySQL: query: {}", sqlCmd);
+
+ try (PreparedStatement statement = connection.prepareStatement(sqlCmd)) {
+ statement.setString(1, itemName);
+ statement.executeUpdate();
+ } catch (Exception e) {
+ errCnt++;
+
+ logger.error("mySQL: Could not remove index for item '{}' with statement '{}'", itemName, sqlCmd, e);
+ }
+ }
+
+ return tableName;
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ // Don't log undefined/uninitialised data
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ // If we've not initialised the bundle, then return
+ if (!initialized) {
+ return;
+ }
+
+ // Connect to mySQL server if we're not already connected
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ // If we still didn't manage to connect, then return!
+ if (!isConnected()) {
+ logger.warn(
+ "mySQL: No connection to database. Cannot persist item '{}'! "
+ + "Will retry connecting to database when error count:{} equals errReconnectThreshold:{}",
+ item, errCnt, errReconnectThreshold);
+ return;
+ }
+
+ // Get the table name for this item
+ String tableName = getTable(item);
+ if (tableName == null) {
+ logger.error("Unable to store item '{}'.", item.getName());
+ return;
+ }
+
+ // Do some type conversion to ensure we know the data type.
+ // This is necessary for items that have multiple types and may return their
+ // state in a format that's not preferred or compatible with the MySQL type.
+ // eg. DimmerItem can return OnOffType (ON, OFF), or PercentType (0-100).
+ // We need to make sure we cover the best type for serialisation.
+ String value;
+ if (item instanceof ColorItem) {
+ value = item.getStateAs(HSBType.class).toString();
+ } else if (item instanceof RollershutterItem) {
+ value = item.getStateAs(PercentType.class).toString();
+ } else {
+ /*
+ * !!ATTENTION!!
+ *
+ * 1.
+ * DimmerItem.getStateAs(PercentType.class).toString() always returns 0
+ * RollershutterItem.getStateAs(PercentType.class).toString() works as expected
+ *
+ * 2.
+ * (item instanceof ColorItem) == (item instanceof DimmerItem) = true
+ * Therefore for instance tests ColorItem always has to be tested before DimmerItem
+ *
+ * !!ATTENTION!!
+ */
+
+ // All other items should return the best format by default
+ value = item.getState().toString();
+ }
+
+ // Get current timestamp
+ long timeNow = Calendar.getInstance().getTimeInMillis();
+ Timestamp timestamp = new Timestamp(timeNow);
+
+ String sqlCmd = null;
+ PreparedStatement statement = null;
+ try {
+ if (localtime) {
+ sqlCmd = "INSERT INTO " + tableName + " (TIME, VALUE) VALUES(?,?) ON DUPLICATE KEY UPDATE VALUE=?;";
+ statement = connection.prepareStatement(sqlCmd);
+ statement.setTimestamp(1, timestamp);
+ statement.setString(2, value);
+ statement.setString(3, value);
+ } else {
+ sqlCmd = "INSERT INTO " + tableName + " (TIME, VALUE) VALUES(NOW(),?) ON DUPLICATE KEY UPDATE VALUE=?;";
+ statement = connection.prepareStatement(sqlCmd);
+ statement.setString(1, value);
+ statement.setString(2, value);
+ }
+
+ statement.executeUpdate();
+
+ logger.debug("mySQL: Stored item '{}' as '{}'[{}] in SQL database at {}.", item.getName(),
+ item.getState().toString(), value, timestamp.toString());
+ logger.debug("mySQL: query: {}", sqlCmd);
+
+ // Success
+ errCnt = 0;
+ } catch (Exception e) {
+ errCnt++;
+
+ logger.error("mySQL: Could not store item '{}' in database with statement '{}': {}", item.getName(), sqlCmd,
+ e.getMessage());
+ } finally {
+ if (statement != null) {
+ try {
+ statement.close();
+ } catch (Exception hidden) {
+ }
+ }
+ }
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ /**
+ * Checks if we have a database connection
+ *
+ * @return true if connection has been established, false otherwise
+ */
+ private boolean isConnected() {
+ // Check if connection is valid
+ try {
+ // Connection.isValid() takes its timeout in seconds, not milliseconds
+ if (connection != null && !connection.isValid(5)) {
+ errCnt++;
+ logger.error("mySQL: Connection is not valid!");
+ }
+ } catch (SQLException e) {
+ errCnt++;
+
+ logger.error("mySQL: Error while checking connection.", e);
+ }
+
+ // Error check. If we have 'errReconnectThreshold' errors in a row, then
+ // reconnect to the database
+ if (errReconnectThreshold != 0 && errCnt >= errReconnectThreshold) {
+ logger.error("mySQL: Error count exceeded {}. Disconnecting database.", errReconnectThreshold);
+ disconnectFromDatabase();
+ }
+ return connection != null;
+ }
+
+ /**
+ * Connects to the database
+ */
+ private void connectToDatabase() {
+ try {
+ // Reset the error counter
+ errCnt = 0;
+
+ logger.debug("mySQL: Attempting to connect to database {}", url);
+ Class.forName(driverClass).newInstance();
+ connection = DriverManager.getConnection(url, user, password);
+ logger.debug("mySQL: Connected to database {}", url);
+
+ boolean itemsTableExists;
+ try (Statement st = connection.createStatement();
+ ResultSet rs = st.executeQuery("SHOW TABLES LIKE 'Items'")) {
+ // SHOW TABLES returns a result set, so executeQuery must be used
+ itemsTableExists = rs.next();
+ }
+ if (waitTimeout != -1) {
+ logger.debug("mySQL: Setting wait_timeout to {} seconds.", waitTimeout);
+ try (Statement st = connection.createStatement()) {
+ st.executeUpdate("SET SESSION wait_timeout=" + waitTimeout);
+ }
+ }
+ if (!itemsTableExists) {
+ try (Statement st = connection.createStatement()) {
+ st.executeUpdate(
+ "CREATE TABLE Items (ItemId INT NOT NULL AUTO_INCREMENT,ItemName VARCHAR(200) NOT NULL,PRIMARY KEY (ItemId));");
+ }
+ }
+
+ // Retrieve the table array
+ try (Statement st = connection.createStatement()) {
+ // Turn use of the cursor on.
+ st.setFetchSize(50);
+ try (ResultSet rs = st.executeQuery("SELECT ItemId, ItemName FROM Items")) {
+ while (rs.next()) {
+ sqlTables.put(rs.getString(2), "Item" + rs.getInt(1));
+ }
+ }
+ }
+ } catch (Exception e) {
+ logger.error("mySQL: Failed connecting to the SQL database using: driverClass={}, url={}, user={}",
+ driverClass, url, user, e);
+ }
+ }
+
+ /**
+ * Disconnects from the database
+ */
+ private void disconnectFromDatabase() {
+ if (connection != null) {
+ try {
+ connection.close();
+ logger.debug("mySQL: Disconnected from database {}", url);
+ } catch (Exception e) {
+ logger.error("mySQL: Failed disconnecting from the SQL database.", e);
+ }
+ connection = null;
+ }
+ }
+
+ /**
+ * Formats the given alias by utilizing {@link Formatter}.
+ *
+ * @param alias
+ * the alias String which contains format strings
+ * @param values
+ * the values which will be replaced in the alias String
+ *
+ * @return the formatted value. All format strings are replaced by
+ * appropriate values
+ * @see java.util.Formatter for detailed information on format Strings.
+ */
+ protected String formatAlias(String alias, Object... values) {
+ return String.format(alias, values);
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ if (!initialized) {
+ logger.debug("Query aborted on item {} - mySQL not initialised!", filter.getItemName());
+ return Collections.emptyList();
+ }
+
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ if (!isConnected()) {
+ logger.debug("Query aborted on item {} - mySQL not connected!", filter.getItemName());
+ return Collections.emptyList();
+ }
+
+ SimpleDateFormat mysqlDateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
+
+ // Get the item name from the filter
+ // Also get the Item object so we can determine the type
+ Item item = null;
+ String itemName = filter.getItemName();
+ logger.debug("mySQL query: item is {}", itemName);
+ try {
+ if (itemRegistry != null) {
+ item = itemRegistry.getItem(itemName);
+ }
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item type for {}", itemName);
+
+ // Set type to null - data will be returned as StringType
+ item = null;
+ }
+
+ if (item instanceof GroupItem) {
+ // For a GroupItem, the base item is needed to determine the correct value type.
+ item = ((GroupItem) item).getBaseItem();
+ }
+
+ String table = sqlTables.get(itemName);
+ if (table == null) {
+ logger.error("mySQL: Unable to find table for query '{}'.", itemName);
+ return Collections.emptyList();
+ }
+
+ String filterString = "";
+
+ if (filter.getBeginDate() != null) {
+ if (filterString.isEmpty()) {
+ filterString += " WHERE";
+ } else {
+ filterString += " AND";
+ }
+ filterString += " TIME>'" + mysqlDateFormat.format(filter.getBeginDate()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ if (filterString.isEmpty()) {
+ filterString += " WHERE";
+ } else {
+ filterString += " AND";
+ }
+ filterString += " TIME<'" + mysqlDateFormat.format(filter.getEndDate().getTime()) + "'";
+ }
+
+ if (filter.getOrdering() == Ordering.ASCENDING) {
+ filterString += " ORDER BY Time ASC";
+ } else {
+ filterString += " ORDER BY Time DESC";
+ }
+
+ if (filter.getPageSize() != Integer.MAX_VALUE) {
+ filterString += " LIMIT " + filter.getPageNumber() * filter.getPageSize() + "," + filter.getPageSize();
+ }
+
+ try {
+ long timerStart = System.currentTimeMillis();
+
+ // Retrieve the table array
+ Statement st = connection.createStatement();
+
+ String queryString = "SELECT Time, Value FROM " + table;
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+
+ logger.debug("mySQL: query: {}", queryString);
+
+ // Turn use of the cursor on.
+ st.setFetchSize(50);
+
+ ResultSet rs = st.executeQuery(queryString);
+
+ long count = 0;
+ List<HistoricItem> items = new ArrayList<>();
+ State state;
+ while (rs.next()) {
+ count++;
+
+ if (item instanceof NumberItem) {
+ state = new DecimalType(rs.getDouble(2));
+ } else if (item instanceof ColorItem) {
+ state = new HSBType(rs.getString(2));
+ } else if (item instanceof DimmerItem) {
+ state = new PercentType(rs.getInt(2));
+ } else if (item instanceof SwitchItem) {
+ state = OnOffType.valueOf(rs.getString(2));
+ } else if (item instanceof ContactItem) {
+ state = OpenClosedType.valueOf(rs.getString(2));
+ } else if (item instanceof RollershutterItem) {
+ state = new PercentType(rs.getInt(2));
+ } else if (item instanceof DateTimeItem) {
+ Calendar calendar = Calendar.getInstance();
+ calendar.setTimeInMillis(rs.getTimestamp(2).getTime());
+ state = new DateTimeType(calendar);
+ } else {
+ state = new StringType(rs.getString(2));
+ }
+
+ MysqlItem mysqlItem = new MysqlItem(itemName, state, rs.getTimestamp(1));
+ items.add(mysqlItem);
+ }
+
+ rs.close();
+ st.close();
+
+ long timerStop = System.currentTimeMillis();
+ logger.debug("mySQL: query returned {} rows in {} ms", count, timerStop - timerStart);
+
+ // Success
+ errCnt = 0;
+
+ return items;
+ } catch (SQLException e) {
+ errCnt++;
+ logger.error("mySQL: Error running query.", e);
+ }
+ return Collections.emptyList();
+ }
+
+ @Override
+ public @NonNull Set<@NonNull PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+}
diff --git a/bundles/org.openhab.persistence.rrd4j/.classpath b/bundles/org.openhab.persistence.rrd4j/.classpath
new file mode 100644
index 0000000000000..19368e503c1f5
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.classpath
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/bundles/org.openhab.persistence.rrd4j/.project b/bundles/org.openhab.persistence.rrd4j/.project
new file mode 100644
index 0000000000000..8a11f4b94a724
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.project
@@ -0,0 +1,23 @@
+
+
+ org.openhab.persistence.rrd4j
+
+
+
+
+
+ org.eclipse.jdt.core.javabuilder
+
+
+
+
+ org.eclipse.m2e.core.maven2Builder
+
+
+
+
+
+ org.eclipse.jdt.core.javanature
+ org.eclipse.m2e.core.maven2Nature
+
+
diff --git a/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.core.resources.prefs b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.core.resources.prefs
new file mode 100644
index 0000000000000..e9441bb123ec3
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.core.resources.prefs
@@ -0,0 +1,3 @@
+eclipse.preferences.version=1
+encoding//src/main/java=UTF-8
+encoding/=UTF-8
diff --git a/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.jdt.core.prefs b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.jdt.core.prefs
new file mode 100644
index 0000000000000..29fe717b8794b
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.jdt.core.prefs
@@ -0,0 +1,11 @@
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=11
+org.eclipse.jdt.core.compiler.compliance=11
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enablePreviewFeatures=disabled
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.problem.forbiddenReference=warning
+org.eclipse.jdt.core.compiler.problem.reportPreviewFeatures=warning
+org.eclipse.jdt.core.compiler.release=disabled
+org.eclipse.jdt.core.compiler.source=11
diff --git a/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.m2e.core.prefs b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.m2e.core.prefs
new file mode 100644
index 0000000000000..f897a7f1cb238
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.m2e.core.prefs
@@ -0,0 +1,4 @@
+activeProfiles=
+eclipse.preferences.version=1
+resolveWorkspaceProjects=true
+version=1
diff --git a/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.pde.core.prefs b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.pde.core.prefs
new file mode 100644
index 0000000000000..184597228d4b5
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/.settings/org.eclipse.pde.core.prefs
@@ -0,0 +1,4 @@
+#Mon Oct 11 21:06:38 CEST 2010
+eclipse.preferences.version=1
+pluginProject.extensions=false
+resolve.requirebundle=false
diff --git a/bundles/org.openhab.persistence.rrd4j/NOTICE b/bundles/org.openhab.persistence.rrd4j/NOTICE
new file mode 100644
index 0000000000000..6c17d0d8a455b
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/NOTICE
@@ -0,0 +1,14 @@
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-core
+
diff --git a/bundles/org.openhab.persistence.rrd4j/README.md b/bundles/org.openhab.persistence.rrd4j/README.md
new file mode 100644
index 0000000000000..0651864477ea9
--- /dev/null
+++ b/bundles/org.openhab.persistence.rrd4j/README.md
@@ -0,0 +1,154 @@
+# rrd4j Persistence
+
+The [rrd4j](https://github.com/rrd4j/rrd4j) Persistence service is based on a round-robin database.
+
+In contrast to a "normal" database such as db4o, a round-robin database does not grow in size; it has a fixed, pre-allocated size.
+This is accomplished by saving a fixed number of datapoints and by compressing the data, which means that the older the data is, the fewer values are available.
+The data is kept in several "archives", each holding the data for its set timeframe in the set granularity.
+The start point for all archives is the most recently saved datapoint.
+So while you might have a value every minute for the last 8 hours, you might only have one every day for the last year.
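The granularity trade-off described above can be made concrete with a small arithmetic sketch. The step length and archive sizes below are illustrative values, not the service defaults:

```java
public class ArchiveSpan {
    // Time span covered by one archive:
    // base step * consolidated steps per bin * number of stored rows.
    static long spanSeconds(long baseStep, long steps, long rows) {
        return baseStep * steps * rows;
    }

    public static void main(String[] args) {
        long step = 60; // base step: one raw sample per minute (assumed)
        // An archive keeping 480 one-minute bins covers 8 hours
        System.out.println(spanSeconds(step, 1, 480) / 3600 + " h");   // prints "8 h"
        // An archive keeping 365 daily bins (1440 steps each) covers a year
        System.out.println(spanSeconds(step, 1440, 365) / 86400 + " d"); // prints "365 d"
    }
}
```

This is why minute-level detail is available only for recent hours, while only daily averages remain for the distant past.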
+
+Because of the data compression, this service cannot answer every query precisely; for older time ranges, only the consolidated values are available.
+
+NOTE: rrd4j is for storing numerical data only.
+Attempting to use rrd4j to store complex datatypes (e.g. for restore-on-startup) will not work.
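Since only numbers can be stored, non-numeric states have to be mapped to numbers before persisting. A hypothetical mapping (for illustration only, not the service's actual code) might look like:

```java
public class StateToDouble {
    // Hypothetical mapping of common item states to the doubles an
    // rrd4j datasource can store; complex types have no such mapping
    // and therefore cannot be persisted by this service.
    static double toDouble(String state) {
        switch (state) {
            case "ON":
            case "OPEN":
                return 1.0;
            case "OFF":
            case "CLOSED":
                return 0.0;
            default:
                // throws NumberFormatException for complex types
                return Double.parseDouble(state);
        }
    }

    public static void main(String[] args) {
        System.out.println(toDouble("ON"));   // prints 1.0
        System.out.println(toDouble("21.5")); // prints 21.5
    }
}
```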
+
+
+
+- [Configuration](#configuration)
+ - [Datasource types](#datasource-types)
+ - [Heartbeat, MIN, MAX](#heartbeat-min-max)
+ - [Step\(s\)](#steps)
+ - [Example](#example)
+- [Troubleshooting](#troubleshooting)
+
+
+
+## Configuration
+
+This service can be configured in the file `services/rrd4j.cfg`.
+
+| Property | Default | Required | Description |
+|----------|---------|:--------:|-------------|
+| `<dsname>.def` | | | `<dstype>,<heartbeat>,[<min>\|U],[<max>\|U],<step>`. For example, `COUNTER,900,0,U,300` |
+| `<dsname>.archives` | | | `<consolidationfunction>,<xff>,<steps>,<rows>`. For example, `AVERAGE,0.5,1,365:AVERAGE,0.5,7,300` |
+| `<dsname>.items` | | | `<list of items for this dsname>`. For example, `Item1,Item2` |
+
+where:
+
+- Sections in `[square brackets]` are optional.
+- `<dsname>` is a name you choose for the datasource.
+- See [Datasource types](#datasource-types) for an explanation of `<dstype>`.
+- See [Heartbeat, MIN, MAX](#heartbeat-min-max) for an explanation of `<heartbeat>`, `<min>`, `<max>` and `U`.
+- See [Step\(s\)](#steps) for an explanation of `<step>`, `<consolidationfunction>`, `<xff>`, `<steps>`, and `<rows>`.
+- `<list of items for this dsname>` is explained in
+
+Round-robin databases (RRDs) have fixed-length so-called "archives" for storing values.
+One RRD can have (in general) several datasources and each datasource can have several archives.
+openHAB only supports one datasource per RRD (i.e. per stored item), which is named DATASOURCE_STATE.
+Multiple configurations (with differing .items settings) can be used (see example below).
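As a quick sketch (datasource and item names are made up; the numeric values reuse the examples from the table above), two independent configurations in `services/rrd4j.cfg` could look like:

```text
ctr5min.def=COUNTER,900,0,U,300
ctr5min.archives=AVERAGE,0.5,1,365:AVERAGE,0.5,7,300
ctr5min.items=Electricity_Meter
temperature.def=GAUGE,600,U,U,60
temperature.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,60,24
temperature.items=Temperature_LivingRoom
```

Each item is persisted with the settings of the datasource whose `.items` list contains it.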
+
+### Datasource types
+
+Depending on the data to be stored, several types for datasources exist:
+
+- **COUNTER** represents an ever-incrementing value (historically this was used for packet or traffic counters on network interfaces; a typical home-automation application would be your electricity meter). If you store the values of this counter in a simple database and chart them, you'll most likely see a nearly flat line, because the increments per time are small compared to the absolute value (e.g. if your electricity meter reads 60567 kWh and you add 0.5 kWh per hour, then a chart of the whole day will show 60567 at the start and 60579 at the end, a change that is nearly invisible). rrd4j helps you out and will display the difference from one stored value to the next (depending on the selected time range). Please note that the persistence extensions will return differences instead of the actual values if you use this type; this especially leads to wrong values if you try to restoreOnStartup!
+- **GAUGE** represents the reading of e.g. a temperature sensor. You'll see only small deviations over the day and your values will be within a small range, clearly visible in a chart.
+- **ABSOLUTE** is like a counter, but RRD4J assumes that the counter is reset when the value is read. So these are basically the delta values between the reads.
+- **DERIVE** is like a counter, but it can also decrease and therefore have a negative delta.
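The practical difference between COUNTER and GAUGE charting can be sketched as follows; the meter readings reuse the hypothetical figures from the COUNTER description above:

```java
public class CounterDelta {
    // For a COUNTER datasource the charted values behave like the
    // difference between successive readings, not the absolute reading.
    static double[] deltas(double[] readings) {
        double[] out = new double[readings.length - 1];
        for (int i = 1; i < readings.length; i++) {
            out[i - 1] = readings[i] - readings[i - 1];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] meter = { 60567.0, 60567.5, 60568.0 }; // kWh, read hourly
        // A GAUGE chart would show the near-flat absolute readings;
        // a COUNTER chart shows the per-interval increments instead.
        System.out.println(java.util.Arrays.toString(deltas(meter))); // prints [0.5, 0.5]
    }
}
```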
+
+### Heartbeat, MIN, MAX
+
+Each datasource also has a value for heartbeat, minimum and maximum. This heartbeat setting helps the database to detect "missing" values, i.e. if no new value is stored after "heartbeat" seconds, the value is considered missing when charting. Minimum and maximum define the range of acceptable values for that datasource.
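A minimal sketch of the heartbeat rule (the 900-second value mirrors the `COUNTER,900,0,U,300` example above; the helper itself is hypothetical, not part of rrd4j):

```java
public class HeartbeatCheck {
    // If the gap between two samples exceeds the heartbeat, the interval
    // is considered missing and rendered as a gap when charting.
    static boolean isMissing(long prevEpoch, long nextEpoch, long heartbeatSeconds) {
        return nextEpoch - prevEpoch > heartbeatSeconds;
    }

    public static void main(String[] args) {
        long heartbeat = 900; // seconds
        System.out.println(isMissing(0, 600, heartbeat));  // within heartbeat: prints false
        System.out.println(isMissing(0, 1200, heartbeat)); // sample overdue: prints true
    }
}
```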
+
+### Step(s)
+
+Step (set in `.def=,