Metadata values may have the type `UINT64` (or `INT64`). JavaScript numbers internally use `double`. This means that integer values larger than ~2^53 lose precision.
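The 2^53 boundary can be seen directly in a small sketch: beyond `Number.MAX_SAFE_INTEGER`, adjacent integers are no longer distinguishable as doubles.

```javascript
// Integers are exactly representable as doubles only up to 2^53 - 1:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
// Beyond that boundary, adding 1 is lost to rounding:
console.log(2 ** 53 === 2 ** 53 + 1); // true
```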
An example of this (intentionally!) appears in the `TilesetWithFullMetadata` sample: the value 18446744073709551615, which is the maximum value that a `UINT64` may have. When this value is parsed (using standard `JSON.parse`), it is converted to a `number` value. Due to the rounding and precision loss, the result is larger than the actual maximum. So this currently causes a validation error:
```json
{
  "type": "VALUE_NOT_IN_RANGE",
  "path": "/metadata/example_UINT64_SCALAR",
  "message": "The value has type UINT64 and must be in [0,18446744073709551615], but is 18446744073709552000",
  "severity": "ERROR"
},
```
It is theoretically possible to avoid this, but only within the validator: The validator could use https://www.npmjs.com/package/json-bigint for parsing and obtain the exact values. But this will break as soon as there is an option to validate an (already parsed) object that represents the tileset: We cannot force clients to use `json-bigint`. They should still be able to use `JSON.parse`, even if the result may then only contain the rounded values.
The severity level of this particular check should probably be reduced from `ERROR` to `WARNING`, and adjusted to have an appropriate message (roughly: "Yes, this is probably valid, but not very portable"). This is not entirely straightforward, though: One could argue that the handling should be different depending on whether the values have been obtained from a metadata entity in JSON, or whether they have been read from binary metadata. The latter can represent these values exactly, and clients can fetch the value as `number` or `bigint`, depending on the implementation.
(EDIT: What originally was written here did not make sense, because it referred to a case that was disallowed by the specification. But the core of the question remains, and has been updated accordingly)
- When these minimum/maximum values are supposed to be computed from the property table, one would expect them to be computed with `bigint` for the `(U)INT64` types
- The expected minimum/maximum values are defined in JSON, meaning that they are limited to `number`
- Comparing `bigint` values to `number` values does not make sense in many cases