In some cases, you may need to aggregate data from incoming streams to reduce data traffic or to derive insights at the edge. Use cases for this kind of stream processing include the following:
Group all incoming Nginx logs by HTTP status code
Snapshot surrounding messages when an "error" is found
Calculate the average, maximum, and minimum response time from log messages
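As an illustration, the last use case maps naturally onto the Fluent Bit stream processor's SQL interface. The fragment below is a sketch, not a drop-in task: the tag pattern `nginx.*` and the record field `resp_time` are hypothetical names you would replace with your own.

```
[STREAM_TASK]
    # Hypothetical task: aggregate response times over a 5-second tumbling window
    Name   response_time_stats
    Exec   SELECT AVG(resp_time), MAX(resp_time), MIN(resp_time) FROM TAG:'nginx.*' WINDOW TUMBLING (5 SECOND);
```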
Defining a streams file
Calyptia Core automatically configures a streams file for every pipeline by default, so you only need to define new Stream Tasks on top of the incoming data. For a full list of directives and capabilities of the stream processor, see the Fluent Bit documentation.
The following streams file counts the number of records in a five-second period.
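A minimal sketch of such a file is shown below; the task name and the `TAG:'*'` match pattern are illustrative and should be adapted to your pipeline.

```
[STREAM_TASK]
    # Hypothetical task: count all records arriving in each 5-second tumbling window
    Name   count_records
    Exec   SELECT COUNT(*) FROM TAG:'*' WINDOW TUMBLING (5 SECOND);
```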