
Rounding error when reading long from db and then writing it back. #440

Open
kirillsc opened this issue Apr 9, 2018 · 10 comments

@kirillsc

kirillsc commented Apr 9, 2018

Hi,

I am observing rounding errors on integer fields when reading them from the database and writing them back.

I have a measurement with an integer column that represents a timestamp with nanosecond resolution (note: this is not the default time column).

SHOW FIELD KEYS ON db
name: latency_1
fieldKey   fieldType
--------   ---------
latency_ms integer
msg_name   string
msg_type   integer
path_id    integer
seq_no     integer
ts_egr     integer  <- timestamp_ns
ts_ingr    integer

I am reading the data from the database using influxdb-java. All integer values that I am reading come back as Doubles (this, I believe, is related to this issue).

String qstatement = String.format(
        "SELECT * FROM %s WHERE time > now() - %dms ORDER BY time",
        table_name, how_far_back_in_time_ms);
Query query = new Query(qstatement, db_name);
QueryResult result = influxDB.query(query, TimeUnit.NANOSECONDS);

For completeness: before writing the values back to the db, I am adding one more column where I take the original value of the "ts_egr" column and convert it to Long before writing it back, i.e.:

Double ts_egr = (Double) obj;                    // original "ts_egr" value of the column
Long long_ts = Long.valueOf(ts_egr.longValue()); // stored as the new "ts_egr_long" column

Now I am reading a row from the original table and the new table in InfluxDB:

> select * from latency_1 where seq_no = 70
name: latency_1
time                latency_ms msg_name             msg_type path_id seq_no ts_egr              ts_ingr
----                ---------- --------             -------- ------- ------ ------              -------
1523264395509000000 72         abcdefgh12345678abcd 1        1       70     1523264395509072018 1523264395509000000
> select * from modified_latency_1 where seq_no = 70
name: modified_latency_1
time                latency_ms msg_name             msg_type path_id seq_no ts_egr              ts_egr_long         ts_ingr
----                ---------- --------             -------- ------- ------ ------              -----------         -------
1523264395508999936 72         abcdefgh12345678abcd 1        1       70     1523264395509072100 1523264395509072128 1523264395509000000

Note that the original value of "ts_egr", 1523264395509072018, has been changed to 1523264395509072100 just by reading it and writing it back, and if we do the conversion back to Long in Java we get yet another variant, 1523264395509072128. Overall, I was observing rounding errors of up to 100 nanoseconds.
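For reference, the loss is reproducible in plain Java with no InfluxDB involved: a 19-digit nanosecond timestamp does not fit in the 53-bit significand of an IEEE-754 double, so the nearest representable double can be more than a hundred nanoseconds away. A minimal sketch using the ts_egr value from the row above:

public class TimestampPrecisionDemo {
    public static void main(String[] args) {
        long original = 1523264395509072018L;   // ts_egr value from the row above
        double asDouble = (double) original;    // what a Double-only JSON layer hands back
        long roundTripped = (long) asDouble;

        System.out.println(original);      // 1523264395509072018
        System.out.println(roundTripped);  // 1523264395509072128 (nearest representable double)
        System.out.println(asDouble);      // 1.5232643955090721E18, i.e. ...072100 in decimal
    }
}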

Am I doing something wrong here, and is there a way to avoid the rounding errors?

Thank you

@fmachado
Contributor

fmachado commented Apr 9, 2018

@kirillsc just FYI: not completely related to your issue, but InfluxDB has supported cast operations[1] (float->integer and integer->float) since v1.0.

[1] https://docs.influxdata.com/influxdb/v1.5/query_language/data_exploration/#cast-operations
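For anyone looking for the syntax, a cast looks like this when sent through influxdb-java. This is just a hypothetical sketch reusing the measurement and field from this issue; it only affects the query on the server side and does not change how the Java client deserializes the numbers.

// Hypothetical example of the "::" cast syntax from [1], using this issue's measurement.
Query castQuery = new Query(
        "SELECT ts_egr::float FROM latency_1 WHERE seq_no = 70",  // integer field cast to float
        "db");
QueryResult castResult = influxDB.query(castQuery);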

@fmachado
Contributor

fmachado commented Apr 10, 2018

@kirillsc unfortunately I don't think there is an easy way to fix your issue and stay backward compatible. The current JSON parser[1] used by influxdb-java is not "smart" enough to handle non-floating-point numbers[2] as Integer or Long (or even BigInteger, as Jackson does).
Let's see if the Moshi devs can help us here.
[1] square/moshi#192
[2] https://github.com/square/moshi/blob/master/moshi/src/main/java/com/squareup/moshi/JsonReader.java#L440
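The behaviour described in [2] is easy to reproduce standalone; a small sketch (assuming only Moshi on the classpath) showing a large integer inside a collection coming back as a Double:

import com.squareup.moshi.JsonAdapter;
import com.squareup.moshi.Moshi;

public class MoshiNumberDemo {
    public static void main(String[] args) throws Exception {
        Moshi moshi = new Moshi.Builder().build();
        // The generic Object adapter is what handles the numbers inside QueryResult's List<Object> values.
        JsonAdapter<Object> adapter = moshi.adapter(Object.class);

        Object parsed = adapter.fromJson("[1523264395509072018]");
        System.out.println(parsed);  // [1.5232643955090721E18] -- a List containing a Double
    }
}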

@kirillsc
Author

Thank you for the clarification and the links!

@fmachado
Contributor

According to a Moshi developer[1], with the next (unreleased) version we may be able to have our own custom JsonAdapter for numbers.

[1] square/moshi#192 (comment)
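One possible shape for such an adapter (a sketch only, not the patch that eventually landed in influxdb-java): read each number as its raw literal and only fall back to Double when the literal actually contains a fraction or an exponent.

import com.squareup.moshi.JsonAdapter;
import com.squareup.moshi.JsonReader;
import com.squareup.moshi.JsonWriter;
import com.squareup.moshi.Moshi;

import java.io.IOException;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of a number-preserving replacement for Moshi's default Object adapter.
public final class LongPreservingObjectAdapter extends JsonAdapter<Object> {

    @Override
    public Object fromJson(JsonReader reader) throws IOException {
        switch (reader.peek()) {
            case NUMBER: {
                // Read the raw literal and keep integral values as Long.
                String literal = reader.nextString();
                if (literal.contains(".") || literal.contains("e") || literal.contains("E")) {
                    return Double.valueOf(literal);
                }
                return Long.valueOf(literal);
            }
            case BEGIN_ARRAY: {
                List<Object> list = new ArrayList<>();
                reader.beginArray();
                while (reader.hasNext()) {
                    list.add(fromJson(reader));
                }
                reader.endArray();
                return list;
            }
            case BEGIN_OBJECT: {
                Map<String, Object> map = new LinkedHashMap<>();
                reader.beginObject();
                while (reader.hasNext()) {
                    map.put(reader.nextName(), fromJson(reader));
                }
                reader.endObject();
                return map;
            }
            case STRING:
                return reader.nextString();
            case BOOLEAN:
                return reader.nextBoolean();
            case NULL:
                return reader.nextNull();
            default:
                throw new IllegalStateException("Unexpected token: " + reader.peek());
        }
    }

    @Override
    public void toJson(JsonWriter writer, Object value) throws IOException {
        // Serialization is not the concern here; delegate to Moshi's stock adapter.
        new Moshi.Builder().build().adapter(Object.class).toJson(writer, value);
    }
}

Something along these lines would still need to be registered on the Moshi instance the client builds (e.g. via Moshi.Builder#add), which is where the backward-compatibility question discussed further down comes in.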

@agolovenko

Any update on this bug?

@agawande72

Any update on this bug?

@majst01
Collaborator

majst01 commented Mar 27, 2019

Moshi 1.6 is released with this feature. @fmachado, do you know exactly what to do?

@majst01
Collaborator

majst01 commented Mar 27, 2019

I created a branch which sets the dependency to Moshi 1.8.0; any ideas on how to implement this explicit cast are welcome.

majst01 added a commit that referenced this issue Mar 27, 2019
fmachado self-assigned this Apr 2, 2019
@fmachado
Contributor

Quick update on this issue: I managed to fix this and I will commit my changes later today so you can review, @majst01.

@fmachado
Contributor

fmachado commented Apr 17, 2019

tl;dr: if you are using InfluxDB 1.4+, use the ResponseFormat.MessagePack when creating your InfluxDB client.
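For anyone landing here later, creating the client with the MessagePack format looks roughly like this. This is a sketch, assuming the InfluxDBFactory.connect overload that accepts a ResponseFormat; check the exact enum constant (e.g. MSGPACK) and the placeholder connection details against your influxdb-java version.

import okhttp3.OkHttpClient;
import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;

public class MsgPackClientDemo {
    public static void main(String[] args) {
        // Sketch: open the client with the MessagePack response format instead of JSON,
        // so int64 fields are transported as integers rather than JSON numbers.
        InfluxDB influxDB = InfluxDBFactory.connect(
                "http://localhost:8086", "user", "password",   // placeholder connection details
                new OkHttpClient.Builder(),
                InfluxDB.ResponseFormat.MSGPACK);
        influxDB.close();
    }
}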

@majst01 I pushed some code to your branch so we can all discuss 5f5b9ea here.

Points for consideration:

  1. Moshi can handle longs/integers appropriately, but for some reason unknown to me, when dealing with a collection (in our case, the List<Object> from QueryResult.Result.Series) it will always handle numbers as Double.
    There are different internal adapters for the data types, and Moshi does detect whether the sequence of characters is a number with a fraction or not, but this is ignored when a collection of objects is used.

  2. Moshi is following RFC 7159, as mentioned by one of their developers.

  3. The JSON created by InfluxDB is not following any RFC that I know of, and it may contain integers that are not compliant with RFC 7159 (see the numeric check after this list):

  • InfluxDB integers are "Signed 64-bit integers (-9223372036854775808 to 9223372036854775807)"
  • RFC 7159, section 6: "(...) numbers that are integers and are in the range [-(2**53)+1, (2**53)-1] are interoperable in the sense that implementations will agree exactly on their numeric values."
    Moshi is not following this spec (for now).

  4. I have not tested for performance degradation with the new JsonAdapter, and I hate doing the same thing Moshi internally already did before (peeking for fp vs. int/long).

  5. Breaking change: my patch is not backward compatible. Numbers that are not floating point will now correctly be returned as Long instead of the (wrong) Double.
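A quick numeric check of the two ranges compared in point 3 above (nothing influxdb-java specific, just the arithmetic):

public class RangeCheck {
    public static void main(String[] args) {
        long rfcInteropMax = (1L << 53) - 1;   // 9007199254740991, the RFC 7159 "interoperable" bound
        long influxIntMax = Long.MAX_VALUE;    // 9223372036854775807, InfluxDB's integer upper bound
        long tsEgr = 1523264395509072018L;     // the ts_egr value from this issue

        System.out.println(tsEgr > rfcInteropMax);   // true: outside the interoperable range
        System.out.println(tsEgr <= influxIntMax);   // true: still a perfectly valid InfluxDB integer
    }
}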
