
Understanding Aggregate Bar Delays

Jun 29, 2020

Sometimes users reach out to us confused about why they receive an aggregate bar late or twice through our WebSocket feed. This behavior is intentional and designed for quality assurance on our end. Here's why:

Trades are occasionally sent to us late. FINRA, for example, has up to 15 minutes to report a trade that happened on any of the dark pools. We handle these delayed trades differently for second, minute, and daily aggregates.

Second Aggregation:
For second aggregates, we wait an additional 2 seconds for any high-latency messages to come in. Once that two-second period has passed, we broadcast the bar and hold onto it for another 15 minutes so that later trades affecting that second can still be applied. It is important to note that we only hold that buffer for 15 minutes. If any trades come in after that 15-minute buffer period, they will not be included until the end of the day.
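
As a rough illustration of those timing rules, here is a minimal Python sketch under assumed names and data shapes (the constants, bar structure, and `broadcast` callback are placeholders, not our actual implementation): a second bar is first emitted after the 2-second wait, corrected and rebroadcast while the 15-minute buffer is open, and deferred to end-of-day after that.

```python
import time
from collections import defaultdict

EMIT_DELAY = 2            # extra seconds to wait for high-latency trades
CORRECTION_WINDOW = 900   # 15-minute correction buffer, in seconds

def new_bar():
    return {"open": None, "high": float("-inf"), "low": float("inf"),
            "close": None, "volume": 0}

bars = defaultdict(new_bar)   # keyed by the bar's start second (epoch)

def apply_trade(bar, price, size):
    """Fold one trade into an OHLCV bar."""
    if bar["open"] is None:
        bar["open"] = price
    bar["high"] = max(bar["high"], price)
    bar["low"] = min(bar["low"], price)
    bar["close"] = price
    bar["volume"] += size

def on_trade(trade_ts, price, size, broadcast):
    """Handle one trade timestamped trade_ts (epoch seconds)."""
    second = int(trade_ts)              # the 1-second window it belongs to
    age = time.time() - (second + 1)    # seconds since that window closed
    if age > CORRECTION_WINDOW:
        return                          # past the buffer: held for end-of-day
    apply_trade(bars[second], price, size)
    if age > EMIT_DELAY:
        # The bar was already broadcast after the 2-second wait (that initial
        # broadcast is driven by a timer, not shown), so send a correction.
        broadcast(second, bars[second])
```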

Minute Aggregation:
For the minute aggregates stream, we broadcast bars as we receive trades and calculate them. If subsequent trades arrive for a minute we have already broadcast, we recalculate that bar and send it out again. Like the second aggregates, if a trade comes in after that 15-minute window, it will not be included in the candle until the end of the day.
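
On the consuming side, the practical implication is to treat each bar's window start as a key and overwrite any earlier copy when a corrected bar arrives. Here is a small illustrative sketch (the message field names are assumptions for the example, not the exact wire format):

```python
# Keep only the latest copy of each bar, keyed by (symbol, window start).
latest_bars = {}

def on_minute_aggregate(msg):
    key = (msg["symbol"], msg["window_start"])
    if key in latest_bars:
        # A corrected version of a bar we already have: replace it.
        print(f"bar updated for {key}")
    latest_bars[key] = {
        "open": msg["open"], "high": msg["high"],
        "low": msg["low"], "close": msg["close"],
        "volume": msg["volume"],
    }
```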

Daily Aggregation:
For daily aggregates, we continuously calculate the single daily bar, so it gets updated no matter how late a trade comes in.

Why Rebroadcast Outdated Information?
We do this to ensure that the data we broadcast via our Stock API is correct, even after late trades arrive. We feel that rebroadcasting a corrected bar is better than silently dropping the late data.

If you have any more questions or need additional information, please don’t hesitate to contact us at support@polygon.io.
