Treasure Data

The td output plugin allows you to flush your records into the Treasure Data cloud service.

Configuration Parameters

The plugin supports the following configuration parameters:

Key       Description                                                                Default
API       The API key. To obtain it, log into the Treasure Data Console and,
          in the API keys box, copy the API key hash.
Database  Specify the name of your target database.
Table     Specify the name of your target table where the records will be stored.
Region    Set the service region. Available values: US and JP.                       US

Getting Started

To start inserting records into Treasure Data, you can run the plugin from the command line or through the configuration file:

Command Line:

$ calyptia-fluent-bit -i cpu -o td -p API="abc" -p Database="fluentbit" -p Table="cpu_samples"

Ideally you don't want to expose your API key on the command line; using a configuration file is highly recommended.

Configuration File

In your main configuration file, append the following Input & Output sections:

[INPUT]
    Name cpu
    Tag  my_cpu

[OUTPUT]
    Name     td
    Match    *
    API      5713/e75be23caee19f8041dfa635ddfbd0dcd8c8d981
    Database fluentbit
    Table    cpu_samples
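
Since the API key is a secret, you may also prefer not to hard-code it in the configuration file. As a minimal sketch, assuming the key has been exported in an environment variable named TD_API_KEY (an illustrative name, not one the plugin requires), classic-mode configuration files can reference environment variables with the ${...} syntax:

# TD_API_KEY is an illustrative variable name; export it in the
# environment of the agent process before starting it.
[OUTPUT]
    Name     td
    Match    *
    API      ${TD_API_KEY}
    Database fluentbit
    Table    cpu_samples

You can then start the agent against your configuration file, for example:

$ calyptia-fluent-bit -c /path/to/calyptia-fluent-bit.conf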
