Tensorflow
Tensorflow Filter is disabled by default in Calyptia Fluent Bit. If you need any assistance with this filter, please contact the Calyptia support team.
Tensorflow Filter allows running machine learning inference tasks on the records of data coming from input plugins or the stream processor. This filter uses Tensorflow Lite as the inference engine, and requires the Tensorflow Lite shared library to be present during build and at runtime.
Tensorflow Lite is a lightweight open-source deep learning framework used for mobile and IoT applications. Tensorflow Lite only handles inference (not training); it therefore loads pre-trained models (`.tflite` files) that have been converted into the Tensorflow Lite format (FlatBuffer). You can read more about converting Tensorflow models in the Tensorflow Lite converter documentation.
The plugin supports the following configuration parameters:

| Key | Description | Default |
| --- | --- | --- |
| `input_field` | Specify the name of the field in the record to apply inference on. | |
| `model_file` | Path to the model file (`.tflite`) to be loaded by Tensorflow Lite. | |
| `include_input_fields` | Include all input fields in the filter's output. | True |
| `normalization_value` | Divide input values by `normalization_value`. | |
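As an illustration of how these parameters fit into a pipeline, the sketch below attaches the filter to an MQTT input; the field name, model path, and normalization value are placeholders rather than values taken from this documentation.

```
[INPUT]
    Name   mqtt
    Tag    mqtt.data

[FILTER]
    Name                 tensorflow
    Match                mqtt.data
    # field in the record that holds the model input (placeholder name)
    input_field          image
    # pre-trained Tensorflow Lite model to load (placeholder path)
    model_file           /path/to/model.tflite
    include_input_fields false
    # divide each input value by 255 before inference
    normalization_value  255

[OUTPUT]
    Name   stdout
    Match  *
```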
Clone the Tensorflow repository, install the Bazel package manager, and run the following command in order to create the shared library:
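A sketch of that step, assuming the Tensorflow Lite C API target that produces the library mentioned below; run it from the root of the cloned Tensorflow source tree:

```
# build the Tensorflow Lite C API as a shared library
bazel build -c opt //tensorflow/lite/c:tensorflowlite_c
```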
The script creates the shared library `bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so`. You need to copy the library to a location (such as `/usr/lib`) that can be used by Calyptia Fluent Bit.
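For example, assuming the default Bazel output path and `/usr/lib` as the destination:

```
# make the Tensorflow Lite C library visible to the dynamic linker
sudo cp bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so /usr/lib/
sudo ldconfig
```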
If the Tensorflow plugin initializes correctly, it reports successful creation of the interpreter and prints a summary of the model's input/output types and dimensions.
The plugin currently has two limitations: it supports single-input models only, and it uses Tensorflow 2.3 header files.
The Tensorflow filter plugin is disabled by default. You need to build Calyptia Fluent Bit with the Tensorflow plugin enabled. In addition, it requires access to the Tensorflow Lite header files to compile. Therefore, you also need to pass the location of the Tensorflow source code on your machine to the build script:
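A sketch of that invocation, assuming the usual `FLB_FILTER_TENSORFLOW` plugin flag and a `Tensorflow_DIR`-style variable for the source path; confirm both names against the build scripts of your Calyptia Fluent Bit version:

```
# run from the Fluent Bit build directory
cmake -DFLB_FILTER_TENSORFLOW=On \
      -DTensorflow_DIR=/path/to/tensorflow \
      ../
make
```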