LogDNA


LogDNA is an intuitive cloud-based log management system that provides an easy interface to query your logs once they are stored.

The Calyptia Fluent Bit logdna output plugin allows you to send your logs or events to a LogDNA-compliant service such as:

  • LogDNA
  • IBM Log Analysis

Before getting started with the plugin configuration, make sure to obtain the proper account to get access to the service. You can start with a free trial through the LogDNA Sign Up link.

Configuration Parameters

Key | Description | Default
logdna_host | LogDNA API host address. | logs.logdna.com
logdna_port | LogDNA TCP port. | 443
api_key | API key to get access to the service. This property is mandatory. |
hostname | Name of the local machine or device where Calyptia Fluent Bit is running. When this value is not set, Calyptia Fluent Bit looks up the hostname and auto-populates the value. If it cannot be found, the value unknown is set instead. |
mac | MAC address. This value is optional. |
ip | IP address of the local hostname. This value is optional. |
tags | A list of comma-separated strings used to group records in LogDNA and simplify queries with filters. |
file | Optional name of a file being monitored. Note that this value is only set if the record does not contain a reference to it. |
app | Name of the application. This value is auto-discovered on each record; if not found, the default value is used. | Calyptia Fluent Bit
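
Only api_key is mandatory; every other parameter falls back to its default value. As a minimal sketch (YOUR_API_KEY_HERE is a placeholder for your own key), an output section could be as small as:

[OUTPUT]
    name     logdna
    match    *
    # logdna_host and logdna_port keep their defaults: logs.logdna.com and 443
    api_key  YOUR_API_KEY_HERE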

Auto Enrichment & Data Discovery

One of the features of the Calyptia Fluent Bit + LogDNA integration is the ability to auto-enrich each record with further context.

When the plugin processes each record (or log), it looks for specific key names that might contain context for the record in question. The following table describes the keys and the discovery logic:

Key | Description
level | If the record contains a key called level or severity, it will populate the context level key with that value. If not found, the context key is not set.
file | If the record contains a key called file, it will populate the context file with the value found; otherwise, if the plugin configuration provides a file property, that value will be used instead (see table above).
app | If the record contains a key called app, it will populate the context app with the value found; otherwise it will use the value set for the app configuration property (see table above).
meta | If the record contains a key called meta, it will populate the context meta with the value found.
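
To illustrate the discovery logic above, the following hypothetical dummy input emits a record carrying severity, file, app, and meta keys; according to the table, the plugin would map severity to the context level and pick up file, app, and meta directly from the record (the sample values are made up for illustration):

[INPUT]
    name    dummy
    # severity is mapped to the context "level"; file, app and meta are taken as-is
    dummy   {"log":"user login failed", "severity":"ERROR", "file":"auth.log", "app":"auth-service", "meta":{"region":"us-east-1"}}

[OUTPUT]
    name     logdna
    match    *
    api_key  YOUR_API_KEY_HERE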

Getting Started

The following configuration example will emit a dummy example record and ingest it into LogDNA. Copy and paste the following content into a file called logdna.conf:

[SERVICE]
    flush     1
    log_level info

[INPUT]
    name      dummy
    dummy     {"log":"a simple log message", "severity": "INFO", "meta": {"s1": 12345, "s2": true}, "app": "Calyptia Fluent Bit"}
    samples   1

[OUTPUT]
    name      logdna
    match     *
    api_key   YOUR_API_KEY_HERE
    hostname  my-hostname
    ip        192.168.1.2
    mac       aa:bb:cc:dd:ee:ff
    tags      aa, bb

Run Calyptia Fluent Bit with the new configuration file:

calyptia-fluent-bit -c logdna.conf

Calyptia Fluent Bit output:

Calyptia Fluent Bit 20.10.03
[2020/04/07 17:44:37] [ info] [storage] version=1.0.3, initializing...
[2020/04/07 17:44:37] [ info] [storage] in-memory
[2020/04/07 17:44:37] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2020/04/07 17:44:37] [ info] [engine] started (pid=2157706)
[2020/04/07 17:44:37] [ info] [output:logdna:logdna.0] configured, hostname=monox-fluent-bit-2
[2020/04/07 17:44:37] [ info] [sp] stream processor started
[2020/04/07 17:44:38] [ info] [output:logdna:logdna.0] logs.logdna.com:443, HTTP status=200
{"status":"ok","batchID":"f95849a8-ec6c-4775-9d52-30763604df9b:40710:ld72"}

Your record will be available and visible in your LogDNA dashboard after a few seconds.
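
To avoid hardcoding the API key in the configuration file, you can rely on the environment variable support of the classic configuration format (see the Variables section) and reference the key at runtime. A sketch, assuming the key has been exported as LOGDNA_API_KEY before starting the agent:

[OUTPUT]
    name      logdna
    match     *
    # ${LOGDNA_API_KEY} is expanded from the environment when the configuration is loaded
    api_key   ${LOGDNA_API_KEY}
    hostname  my-hostname

Start the agent with the variable defined, for example: LOGDNA_API_KEY=YOUR_API_KEY_HERE calyptia-fluent-bit -c logdna.conf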

Query your Data in LogDNA

In your LogDNA dashboard, go to the top filters and select the tags aa and bb; you will then be able to see your records.
