Thanks for using Anomaly Detector.
The Anomaly Detector API's batch detection endpoint lets you detect anomalies throughout your entire time series data. In this detection mode, a single statistical model is created and applied to each point in the data set. If your time series has either of the following characteristics, we recommend using batch detection to preview your data in one API call:
- A seasonal time series, with occasional anomalies.
- A flat trend time series, with occasional spikes/dips.
We don't recommend batch anomaly detection for real-time data monitoring, or for time series data that doesn't have the above characteristics.
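As a concrete illustration of the one-call batch mode, the sketch below builds a request body for entire-series detection. The payload shape (a `series` of timestamp/value points plus a `granularity`) and the endpoint path in the comment are assumptions based on the v1.0 REST API; check the service reference for your API version.

```python
from datetime import datetime, timedelta

def build_batch_request(values, start, granularity="daily"):
    """Build the JSON body for entire-series (batch) detection.

    Assumed payload shape: a list of {timestamp, value} points plus a
    granularity, as in the v1.0 Anomaly Detector REST API.
    """
    series = [
        {"timestamp": (start + timedelta(days=i)).isoformat() + "Z",
         "value": v}
        for i, v in enumerate(values)
    ]
    return {"series": series, "granularity": granularity}

body = build_batch_request([32.1, 33.0, 31.8, 95.0, 32.5], datetime(2024, 1, 1))
# POST this body to {endpoint}/anomalydetector/v1.0/timeseries/entire/detect
# (path assumed); a single response then flags every point in the series.
```

Because the whole series travels in one request, the service can build one model over all points, which is exactly why this mode suits previewing historical data rather than monitoring a live stream.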
Batch detection creates and applies only one model; the detection for each point is done in the context of the whole series. If the time series data trends up and down without seasonality, the model may miss some points of change (dips and spikes in the data). Similarly, points of change that are less significant than later ones in the data set may not be counted as significant enough to be incorporated into the model.
Because of the number of points being analyzed, batch detection is also slower than detecting the anomaly status of only the latest point during real-time data monitoring.
For real-time data monitoring, we recommend detecting the anomaly status of your latest data point only. By continuously applying latest point detection, streaming data monitoring can be done more efficiently and accurately.
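The streaming pattern can be sketched as follows. Note this is an illustration of continuous latest-point detection only: the simple rolling z-score rule here is a stand-in I chose for the service's statistical model, and the `window=28` default mirrors the 28-point history used in the example below.

```python
from collections import deque
from statistics import mean, stdev

def make_latest_point_detector(window=28, threshold=3.0):
    """Illustrative stand-in for 'detect the latest point only':
    keep a sliding window of recent points and flag the newest one.
    A rolling z-score rule replaces the service's actual model."""
    history = deque(maxlen=window)

    def detect(value):
        is_anomaly = False
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            # Flag the new point if it sits far outside recent behavior.
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                is_anomaly = True
        history.append(value)
        return is_anomaly

    return detect

detect = make_latest_point_detector()
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 9.8, 50.0]
flags = [detect(v) for v in stream]  # only the final spike is flagged
```

Each new point costs one small evaluation over the recent window, which is why this mode scales to live monitoring where re-running batch detection on the full history would not.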
The example below shows the impact these detection modes can have on performance. The first picture shows the result of continuously detecting the anomaly status of the latest point along with 28 previously seen data points. The red points are anomalies.
*Image: anomaly detection using the latest point.*
Below is the same data set using batch anomaly detection. The model built for the operation has ignored several anomalies, marked by rectangles.
*Image: anomaly detection using the batch method.*
Thanks again. We will add this information to the public documentation of the Anomaly Detector service.