Ingesting Time Series Data
The Time Series service ingests data using the WebSocket protocol. You can use the Java client library to facilitate ingestion and avoid setting up a WebSocket connection yourself, or create your own WebSocket connection if you prefer to use a language other than Java.
Ingest with the Java Client Library
After setting up your client, you can create ingestion requests using the Ingestion Request Builder. The following shows an example data ingestion request using the Java client library.
IngestionRequestBuilder ingestionBuilder = IngestionRequestBuilder.createIngestionRequest()
    .withMessageId("<MessageID>")
    .addIngestionTag(IngestionTag.Builder.createIngestionTag()
        .withTagName("TagName")
        .addDataPoints(
            Arrays.asList(
                new DataPoint(new Date().getTime(), Math.random(), Quality.GOOD),
                new DataPoint(new Date().getTime(), "BadValue", Quality.BAD),
                new DataPoint(new Date().getTime(), null, Quality.UNCERTAIN))
        )
        .addAttribute("AttributeKey", "AttributeValue")
        .addAttribute("AttributeKey2", "AttributeValue2")
        .build());

String json = ingestionBuilder.build().get(0);
IngestionResponse response = ClientFactory.ingestionClientForTenant(tenant).ingest(json);
String responseStr = response.getMessageId() + response.getStatusCode();
Ingest with your own WebSocket Connection
To ingest data using your own WebSocket connection, open a connection to the ingestion URL:

wss://<ingestion_url>

You must also provide the following headers:
- Authorization: Bearer <token from UAA>
- Predix-Zone-Id: <your zone id>
- Origin: http://<your IP address or "localhost">

For an example of setting up a WebSocket connection, see the TimeseriesClientImpl class in the Time Series bootstrap project: https://github.com/PredixDev/timeseries-bootstrap/blob/master/src/main/java/com/ge/predix/solsvc/timeseries/bootstrap/client/TimeseriesClientImpl.java

The ingestion URL and <Predix-Zone-Id> are included with the environment variables for your application when you bind your application to your Time Series service instance. To view the environment variables, on a command line, enter:

cf env <application_name>
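As a sketch, the connection setup described above can be done with the JDK's built-in WebSocket client (Java 11+). The class and method names here are illustrative, and the URL, token, and zone ID are placeholders you would replace with values from `cf env`; the listener only logs acknowledgement text.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.CompletionStage;

public class IngestWs {

    // Headers required by the Time Series ingestion endpoint.
    public static Map<String, String> requiredHeaders(String token, String zoneId, String origin) {
        Map<String, String> h = new LinkedHashMap<>();
        h.put("Authorization", "Bearer " + token);
        h.put("Predix-Zone-Id", zoneId);
        h.put("Origin", origin);
        return h;
    }

    // Opens the WebSocket; call with the real ingestion URL from `cf env`.
    // Note: some JDK versions restrict the Origin header; if so, run with
    // -Djdk.httpclient.allowRestrictedHeaders=origin
    public static WebSocket connect(String wssUrl, String token, String zoneId, String origin) {
        WebSocket.Builder builder = HttpClient.newHttpClient().newWebSocketBuilder();
        requiredHeaders(token, zoneId, origin).forEach(builder::header);
        return builder.buildAsync(URI.create(wssUrl), new WebSocket.Listener() {
            @Override
            public CompletionStage<?> onText(WebSocket ws, CharSequence data, boolean last) {
                // The service replies with an acknowledgement message per request.
                System.out.println("ack: " + data);
                return WebSocket.Listener.super.onText(ws, data, last);
            }
        }).join();
    }
}
```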
Example Data Ingestion Request
The following shows an example of the JSON payload for an ingestion request:
URL: wss://ingestion_url
Headers:
Authorization: Bearer <token from trusted issuer>
Predix-Zone-Id: <Predix-Zone-Id>
Origin: http://<origin-hostname>/
Request Payload:
{
  "messageId": "<MessageID>",
  "body": [
    {
      "name": "<TagName>",
      "datapoints": [
        [
          <EpochInMs>,
          <Measure>,
          <Quality>
        ]
      ],
      "attributes": {
        "<AttributeKey>": "<AttributeValue>",
        "<AttributeKey2>": "<AttributeValue2>"
      }
    }
  ]
}
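If you are not using the Java client library, the payload above can be assembled with plain string building, as in this minimal sketch (no JSON library assumed); the field names match the example, while the helper name and argument layout are illustrative, and a real request may carry many tags, datapoints, and attributes.

```java
public class IngestionPayload {

    // Builds an ingestion payload for a single tag with one datapoint
    // and one attribute, mirroring the example payload shape.
    public static String build(String messageId, String tagName,
                               long epochMs, double measure, int quality,
                               String attrKey, String attrValue) {
        return "{"
            + "\"messageId\":\"" + messageId + "\","
            + "\"body\":[{"
            +   "\"name\":\"" + tagName + "\","
            +   "\"datapoints\":[[" + epochMs + "," + measure + "," + quality + "]],"
            +   "\"attributes\":{\"" + attrKey + "\":\"" + attrValue + "\"}"
            + "}]"
            + "}";
    }
}
```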
Ingesting backFill data
When ingesting historical or other non-live data, use the backFill option to avoid overloading the live data stream. Data ingested with this option is not immediately available for query.
The following JSON is an example of how to use the backFill option.
URL: wss://ingestion_url
Headers:
Authorization: Bearer <token from trusted issuer>
Predix-Zone-Id: <Predix-Zone-Id>
Origin: http://<origin-hostname>/
Request Payload:
{
  "messageId": "<MessageID>",
  "body": [
    {
      "name": "<TagName>",
      "datapoints": [
        [
          <EpochInMs>,
          <Measure>,
          <Quality>
        ]
      ],
      "attributes": {
        "<AttributeKey>": "<AttributeValue>",
        "<AttributeKey2>": "<AttributeValue2>"
      }
    }
  ],
  "backFill": true
}
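One way to decide when to set the flag is to route datapoints older than some cutoff through backfill ingestion; the sketch below is a hypothetical policy, and the threshold is an arbitrary assumption, not a service requirement.

```java
public class BackfillPolicy {

    // Treats datapoints older than the threshold as historical, so they go
    // through backFill ingestion instead of the live stream.
    public static boolean shouldBackfill(long oldestEpochMs, long nowEpochMs, long thresholdMs) {
        return nowEpochMs - oldestEpochMs > thresholdMs;
    }
}
```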
Ingestion Nuances
- In previous releases, quality was supported as an attribute. Starting with this release, you must explicitly provide quality in each datapoint, along with the timestamp and measurement.
- The Time Series service now accepts compressed (GZIP) JSON payloads. The size limit for the actual JSON payload is 512 KB regardless of the ingestion request format. For compressed payloads, this means that the decompressed payload cannot exceed 512 KB.
- The <MessageID> can be a string or integer, and must be unique. When using an integer, the <MessageID> must be between 0 and 2⁶⁴ (18446744073709551616).
- The <BackFill> value must be a boolean. The ingestion request returns a 400 (Bad Request) status code if <BackFill> is sent as any other data type. This is an optional parameter, and its value defaults to false if not specified.
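Because the documented upper bound for integer message IDs exceeds Java's signed long range, a client-side check is easiest with BigInteger; this is a sketch with an illustrative class and method name.

```java
import java.math.BigInteger;

public class MessageIdCheck {

    // Upper bound from the documentation: 2^64 (18446744073709551616).
    private static final BigInteger MAX = BigInteger.TWO.pow(64);

    // True when the string parses as an integer between 0 and 2^64.
    public static boolean isValidIntegerMessageId(String id) {
        try {
            BigInteger v = new BigInteger(id);
            return v.signum() >= 0 && v.compareTo(MAX) <= 0;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}
```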
Acknowledgement Message
The ingestion service returns an acknowledgement message containing the message ID and a status code:
{
"messageId": <MessageID>,
"statusCode": <AcknowledgementStatusCode>
}
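A minimal sketch for pulling the status code out of the acknowledgement text without a JSON library (a real application would typically use one); the regex assumes the field layout shown above, and the class name is illustrative.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AckParser {

    // Matches the numeric statusCode field in the acknowledgement JSON.
    private static final Pattern STATUS = Pattern.compile("\"statusCode\"\\s*:\\s*(\\d+)");

    // Returns the acknowledgement status code, or -1 if it is absent.
    public static int statusCode(String ack) {
        Matcher m = STATUS.matcher(ack);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }
}
```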
Code | Message |
---|---|
202 | Accepted successfully |
400 | Bad request |
401 | Unauthorized |
413 | Request entity too large. Note: The payload cannot exceed 512 KB. |
503 | Failed to ingest data |
Tips for Data Ingestion
Spread ingestion requests over multiple connections, because sending numerous payloads over the same connection increases the wait time for data availability through the query service. Also, be aware of your data plan when using multiple connections.
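The tip above can be sketched as a simple round-robin assignment of payloads to a fixed pool of connection send queues; connection setup itself is omitted, and the class name is illustrative.

```java
import java.util.ArrayList;
import java.util.List;

public class ConnectionSpread {

    // Distributes payloads round-robin across `connections` send queues,
    // one queue per WebSocket connection.
    public static List<List<String>> partition(List<String> payloads, int connections) {
        List<List<String>> queues = new ArrayList<>();
        for (int i = 0; i < connections; i++) {
            queues.add(new ArrayList<>());
        }
        for (int i = 0; i < payloads.size(); i++) {
            queues.get(i % connections).add(payloads.get(i));
        }
        return queues;
    }
}
```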