Manage Deployment Schedules
About Analytics Deployment Schedules
Use the Schedule step to create a schedule for running a deployed analytic, or to execute it on demand.
Unit | Range | Applicable to
---|---|---
Seconds | Select a numerical value from 1 to 59. |
Minutes | Select a numerical value from 1 to 59. |
Hours | Select a numerical value from 1 to 23. |
Days | Select one day. |
Samples | Select a numerical value for the number of samples before the offset. |
This request | Means this
---|---
Offset | The amount of time in the past to begin the data pull.
Sample Duration (Batch Size) | The amount of data being pulled before the Offset.
Time Span | The time span for which the data is being pulled.
Sampling Interval | The interval between data points.
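The relationship between the Offset and the Sample Duration can be sketched as follows. This is an illustrative example only, not a product API: the `data_window` helper and its parameter names are assumptions made for the sketch.

```python
from datetime import datetime, timedelta, timezone

def data_window(offset: timedelta, sample_duration: timedelta, now: datetime):
    """Compute the [start, end] window for a scheduled data pull.

    The Offset places the end of the window in the past relative to the
    run time; the Sample Duration (Batch Size) extends the window further
    back from that point.
    """
    end = now - offset              # Offset: how far in the past the pull begins
    start = end - sample_duration   # Sample Duration: amount of data before the Offset
    return start, end

# A job running at 12:00 UTC with a 2-hour Offset and a 30-minute
# Sample Duration pulls data from 09:30 to 10:00 UTC.
run_time = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
start, end = data_window(timedelta(hours=2), timedelta(minutes=30), run_time)
```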
Schedule a Recurrent Analytic
You can set an analytic to run continuously, and define how often it runs, how far back in time the data is requested, how large the sample is, and how often the data is sampled.
Before You Begin
Procedure
What To Do Next
Schedule an Analytic On Demand
You can schedule an analytic to run on demand and configure how the data is sampled.
Before You Begin
Procedure
What To Do Next
Schedule a Streaming Analytic
Tenant-specific option to deploy and run analytics for continuous streaming.
Before You Begin
This procedure assumes that the following prerequisite tasks have been completed.
- You have uploaded the assets, tags, and time series data required for this analytic.
- Your tenant has been configured to stream to Predix Time Series.
- You have the required access permissions to manage analytics on Spark runtime for the OPM tenant.
- You have added and configured a streaming analytic in the analytic catalog.
- You have created a deployment for that streaming analytic.
- You have applied the deployment to the applicable assets through asset selection.
- You have mapped tags to the inputs, constants, and outputs before deploying the analytic job to the Spark runtime environment.
About This Task
The streaming schedule is only available for tenants that are configured to support big data streaming through a Spark runtime cluster environment. Depending on the tenant configuration, this option may not be available. Use the Schedule section to deploy and run your forecasting and streaming analytic jobs. You can run the deployed job on demand or through continuous streaming. Once started, the streaming job runs continuously.
Once you have selected the tags to map, you can continue to the 4. Review tab, or save and close. Navigating by selecting outside the deployment screen saves and closes the deployment. Select the Deployment Name link to continue your deployment.
Procedure
Results
Schedule an Orchestration
You can schedule an orchestration to run at regular intervals, or run the orchestration on demand.
About This Task
Similar to any analytic deployment, you can also use the 3. Schedule section of the deployment to run orchestrations.
- Data request parameters can be set on a per-step basis. Parameters such as Start Time and End Time (for Run Once), Offset before schedule, and Sample duration, which are set at the orchestration level, apply to all steps by default. However, you can override these parameters on a per-step basis.
- In addition to the Historian mode, the following modes are available for Time Series data:
  - Math
  - Sample and Hold
- Runtime is supported for Predix Insights only.
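The default-and-override behavior described above can be sketched as a simple parameter merge. This is a hypothetical illustration of the semantics, not the product's configuration format; the dictionary keys and step names are assumptions.

```python
# Orchestration-level data request parameters apply to every step by default.
# Durations here use ISO 8601 notation purely for illustration.
orchestration_params = {
    "offset": "PT2H",           # Offset before schedule
    "sample_duration": "PT30M", # Sample Duration (Batch Size)
}

# A step may override individual parameters; unset parameters fall back
# to the orchestration-level values.
step_overrides = {
    "step_2": {"sample_duration": "PT1H"},  # this step pulls a larger batch
}

def effective_params(step_name: str) -> dict:
    """Resolve the parameters a step actually runs with."""
    params = dict(orchestration_params)          # start from the defaults
    params.update(step_overrides.get(step_name, {}))  # apply any overrides
    return params
```

With these values, `step_1` inherits both orchestration-level parameters, while `step_2` keeps the default offset but uses its own, larger sample duration.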
Unit | Range | Applies to the following fields
---|---|---
Seconds | 1-59 |
Minutes | 1-59 |
Hours | 1-24 |
Days | Select one or more days of the week. |
Months | 1-12 | Offset
Years | Select a numerical value. | Offset
Samples | Select a numerical value for the number of samples before the offset. | Sample Duration (Batch Size)
This request | Means this
---|---
Offset | The amount of time in the past to begin the data pull.
Sample Duration (Batch Size) | The amount of data being sampled before the Offset.
Time Span | The time span for which the data is being sampled.
Sampling Interval | The interval between data points.
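How the Sampling Interval divides a Time Span into individual data points can be sketched as follows. This is an illustrative example only; the `sample_points` helper is an assumption made for the sketch, not a product API.

```python
from datetime import datetime, timedelta, timezone

def sample_points(start: datetime, time_span: timedelta, interval: timedelta):
    """Enumerate the timestamps at which data points are requested
    across a time span, spaced by the sampling interval."""
    points = []
    t = start
    while t <= start + time_span:
        points.append(t)
        t += interval
    return points

# A 1-hour time span sampled at a 15-minute interval yields five points:
# 09:00, 09:15, 09:30, 09:45, and 10:00.
start = datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)
pts = sample_points(start, timedelta(hours=1), timedelta(minutes=15))
```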
Procedure
Schedule an Analytic to Trigger on an Event
Subject to your tenant configuration, you can trigger an analytic to run when a specific alert event happens.
Before You Begin
This procedure assumes that the following prerequisite tasks have been completed.
- You have uploaded the assets, tags, and time series data required for this analytic.
- You have alert templates defined for your tenant.
- Your tenant has been configured to enable the PowerDataFabric runtime.
- You have the required access permissions to manage analytics.
- You have added and configured an analytic template for the PowerDataFabric runtime in the analytic catalog.
- You have created a deployment for that analytic template.
- You have applied the deployment to the applicable assets through asset selection.
- You have mapped tags to the inputs, constants, and outputs before deploying the analytic job.
About This Task
The triggering schedule is only available for tenants that are configured to support event triggering through the PowerDataFabric runtime environment. Depending on the tenant configuration, this option may not be available. Use the Schedule section to deploy and trigger analytic jobs. You can run the deployed job on demand or through an event trigger. Once started, the job is only triggered when one or more of the configured alert events occurs.
Once you have selected the tags to map, you can continue to the 4. Review section, or save and close. Navigating by selecting outside the deployment screen saves and closes the deployment.