• Configuring multiple Kafka topics from one Filebeat instance


    Method 1: one Filebeat instance, multiple topics:

    https://discuss.elastic.co/t/filebeat-5-0-output-to-kafka-multiple-topics/67934

    The document_type per prospector becomes the event field type. That's why the filter won't match.
    
    Instead of conditionals, consider using a format string like:

    filebeat.prospectors:
    - ...
      document_type: myapp_applog
    - ...
      document_type: myapp_applog_stats
    - ...
      document_type: myapp_elblog
    output.kafka:
      topic: '%{[type]}'  # use document_type to set the topic

    btw. the topic field in conditionals also supports format strings.
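    As a sketch of that conditional form: the kafka output also accepts a topics list, where each entry pairs a topic (itself allowed to be a format string) with a when condition. The topic names, hosts, and fields below are illustrative, not from the original post, and assume events carry a type field set via document_type:

    output.kafka:
      hosts: ["192.168.111.107:9092"]
      topic: 'default-log'                       # fallback when no rule matches
      topics:
        - topic: '%{[fields.app_id]}-applog'     # format string inside a conditional
          when.contains:
            type: 'applog'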
    cat filebeat.yml

    output.kafka:
      enabled: true
      hosts: ["192.168.111.107:9092","192.168.111.108:9092","192.168.111.109:9092"]
      topic: '%{[type]}'

    filebeat.prospectors:
    #------------------------------ Log prospector --------------------------------
    - paths:
        - /data/logs/financial-manager/server0.log
      fields:
        app_id: financial-manager
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}\s(20|21|22|23|[0-1]\d):[0-5]\d:[0-5]\d\.'
      multiline.negate: true
      multiline.match: after
      document_type: log-log4j
    - paths:
        - /data/logs/user-service/*.log
      fields:
        app_id: user-service
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}\s(20|21|22|23|[0-1]\d):[0-5]\d:[0-5]\d\.'
      multiline.negate: true
      multiline.match: after
      document_type: log-iis
    - paths:
        - /data/logs/financial-manager/access_log.log
      fields:
        app_id: financial-manager
      document_type: log-undertow

    Method 2: multiple Filebeat instances, one topic per instance

    Run multiple Filebeat instances, each with its own configuration file.

    cat filebeat.yml
    
    output.kafka:
      enabled: true
      hosts: ["192.168.111.107:9092","192.168.111.108:9092","192.168.111.109:9092"]
      topic: log-log4j
    
    
    filebeat.prospectors:
    #------------------------------ Log prospector --------------------------------
    - paths:
        - /data/logs/subscribe-consumer/server0.log
      fields:
        app_id: subscribe-consumer 
    
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}\s(20|21|22|23|[0-1]\d):[0-5]\d:[0-5]\d\.'
      multiline.negate: true
      multiline.match: after
    cat filebeat2.yml
    
    output.kafka:
      enabled: true
      hosts: ["192.168.111.107:9092","192.168.111.108:9092","192.168.111.109:9092"]
      topic: log-tomcat
    
    
    filebeat.prospectors:
    #------------------------------ Log prospector --------------------------------
    - paths:
        - /data/logs/subscribe-consumer/server0.log
      fields:
        app_id: subscribe-consumer 
    
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}\s(20|21|22|23|[0-1]\d):[0-5]\d:[0-5]\d\.'
      multiline.negate: true
      multiline.match: after
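    To run both configurations side by side, each instance needs its own data path so the two registries do not collide. A minimal sketch, assuming Filebeat is on the PATH and using illustrative config and data paths:

    # first instance: ships to the log-log4j topic
    filebeat -c /etc/filebeat/filebeat.yml  -path.data /var/lib/filebeat  &
    # second instance: ships to the log-tomcat topic, with a separate registry
    filebeat -c /etc/filebeat/filebeat2.yml -path.data /var/lib/filebeat2 &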
  • Original post: https://www.cnblogs.com/linkenpark/p/7694159.html