All events ingested by a listener are stored in a stream. You can locate this stream by navigating to your listener's Overview page in Data Connection. Once your data resides in a stream, several processing options are available.
Stream processing with Automate
Streams can be directly processed in Automate, enabling you to execute an action or function for each inbound event. You can create objects in your ontology, run AIP logic, or use sources in functions to interact with external systems, such as writing data back to the system that sent the event.
Processing streaming events with Automate is appropriate when:
- The event stream is not high throughput
- Your event processing is stateless
- Latency of a few seconds is acceptable
- At-least-once processing is sufficient
This approach is suitable for most listener integrations, providing low-cost, low-maintenance event processing workflows.
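To make the stateless, at-least-once criteria above concrete, here is a minimal sketch in plain Python. The event shape and function names are hypothetical, not part of any Automate or Data Connection API; the point is the pattern: a handler that is a pure function of each event, paired with an idempotent write so that redelivered duplicates are harmless.

```python
def handle_event(event: dict) -> tuple[str, str]:
    # Stateless: the output depends only on this one event,
    # never on anything seen before.
    return event["id"], event["payload"].upper()

# A dict stands in for any idempotent, keyed write target
# (e.g., upserting an ontology object keyed by event ID).
store: dict[str, str] = {}

for event in [
    {"id": "evt-1", "payload": "created"},
    {"id": "evt-1", "payload": "created"},  # at-least-once redelivery
    {"id": "evt-2", "payload": "updated"},
]:
    key, value = handle_event(event)
    store[key] = value  # idempotent upsert: the duplicate is a no-op

print(store)  # {'evt-1': 'CREATED', 'evt-2': 'UPDATED'}
```

Because the upsert is keyed by event ID, processing the same event twice leaves the store unchanged, which is why at-least-once delivery is sufficient for this style of workflow.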
Streaming pipelines
Listener event streams can be processed with streaming pipelines as a lower-latency alternative. These pipelines support high-throughput streams, stateful event processing, and (optionally) exactly-once event processing. Streaming pipelines can additionally leverage UDFs to build powerful real-time event-processing workflows.
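The following sketch illustrates why some workloads need the stateful processing that streaming pipelines provide. The class and event shape are hypothetical, not a streaming-pipeline or UDF API: a per-key running count depends on every event seen so far, so the operator must carry state between events, which per-event Automate processing does not.

```python
from collections import defaultdict


class RunningCounter:
    """Toy stateful operator: counts events per key across the stream."""

    def __init__(self) -> None:
        # Operator state retained between events; in a real streaming
        # pipeline this state would be managed and checkpointed for you.
        self._counts: dict[str, int] = defaultdict(int)

    def process(self, event: dict) -> tuple[str, int]:
        key = event["key"]
        self._counts[key] += 1
        return key, self._counts[key]


counter = RunningCounter()
out = [counter.process(e) for e in [
    {"key": "sensor-a"},
    {"key": "sensor-b"},
    {"key": "sensor-a"},  # second event for this key
]]
print(out)  # [('sensor-a', 1), ('sensor-b', 1), ('sensor-a', 2)]
```

Note that with at-least-once delivery a redelivered event would inflate the count, which is why exactly-once processing matters for stateful aggregations like this one.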