• logstash Codec plugins


    Codec: encodes and decodes event data formats
    
    Common formats include json, msgpack, and edn.
    
    
    Logstash processing flow:
    
    input -> decode -> filter -> encode -> output
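    
    The flow above means a codec can sit on either side of the pipeline: decoding raw bytes into events at the input, and encoding events back into text at the output. A minimal sketch (the file path is a hypothetical example, not from these notes):
    
    input {
      file {
        path => "/var/log/app.log"    # hypothetical path, for illustration only
        codec => json                 # decode step: raw line -> structured event
      }
    }
    
    output {
      stdout {
        codec => rubydebug            # encode step: event -> pretty-printed text
      }
    }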
    
    
    plain is a pass-through codec: it applies no parsing, letting the user specify the format themselves
    
    [elk@db01 0204]$ cat plain01.conf 
    input {
      stdin {
      }
    }
    
    output {
      stdout {
        codec => plain
      }
    }
    
    
    
    [elk@db01 0204]$ logstash -f plain01.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    333333
    2017-01-17T21:16:27.548Z db01 33333344444
    2017-01-17T21:16:34.774Z db01 44444
    
    
    [elk@db01 0204]$ cat plain02.conf 
    input {
      stdin {
      }
    }
    
    output {
      stdout {
        codec => json
      }
    }
    
    
    
    [elk@db01 0204]$ logstash -f plain02.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    aaaa
    {"message":"aaaa","@version":"1","@timestamp":"2017-01-17T21:18:22.160Z","host":"db01"}
    
    
    json codec:
    
    If the event data is in JSON format, codec => json can be added to parse it; plain02.conf above shows the same codec applied on the output side.
    
    
    
    json_lines codec:
    
    json_lines splits the stream on newlines and parses each line as a separate JSON document, which suits long-lived tcp connections.
    
    input {
      tcp {
        port => 12388
        host => "127.0.0.1"
        codec => json_lines {
        }
      }
    }
    
    output {
      stdout {}
    }
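    
    
    Assuming Logstash is already running with the tcp config above, the input can be exercised by piping newline-delimited JSON to the port (using nc here is an assumption about the available tooling, in the same spirit as the other transcripts):
    
    [elk@db01 0204]$ echo '{"bookname":"elk","price":12}' | nc 127.0.0.1 12388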
    
    
    
    rubydebug codec:
    
    Pretty-prints the event as a Ruby hash, which is convenient for debugging output
    
    [elk@db01 0204]$ cat ruby.conf 
    input {
      stdin {
        codec => json
      }
    }
    
    output {
      stdout {
        codec => rubydebug
      }
    }
    
    [elk@db01 0204]$ logstash -f ruby.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    {"bookname":"elk","price":12}  
    {
          "bookname" => "elk",
             "price" => 12,
          "@version" => "1",
        "@timestamp" => "2017-01-17T21:40:28.601Z",
              "host" => "db01"
    }
    
    
    
    
    
    multiline: multi-line events
    
    Sometimes a single event is printed across several lines of a log, and all of those lines together form one event.
    
    A Java exception stack trace, for example.
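    
    For Java stack traces specifically, a common variant (a sketch, not from the original notes) merges continuation lines that start with whitespace into the previous event:
    
    input {
      stdin {
        codec => multiline {
          pattern => "^\s"        # continuation lines begin with whitespace...
          what => "previous"      # ...and are appended to the previous event
        }
      }
    }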
    
    
    
    what => "previous": lines that do not match are merged backwards into the previous event
    [elk@db01 0204]$ cat mulit.conf 
    input {
      stdin {
        codec => multiline {
          pattern => "^\["
          negate => true
          what => "previous"
        }
      }
    }
    
    output {
      stdout {}
    }
    
    
    [elk@db01 0204]$ logstash -f mulit.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    [03-Jun-2014 13:34:13:] PHP err01:aaaaaaaaa
    111111111111111
    222222222222222
    [09-Aug-2015 44:33:22] PHP 9999
    2017-01-17T21:59:39.654Z db01 [03-Jun-2014 13:34:13:] PHP err01:aaaaaaaaa
    111111111111111
    222222222222222
    
    
    Why was the [09-Aug-2015 44:33:22] PHP 9999 event not printed? Because multiline holds the current event until the next line matching the pattern arrives to mark where it ends.
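    
    Because the last event is only flushed when the next matching line arrives, the multiline codec offers an auto_flush_interval setting to emit a pending event after a period of silence (whether it is available depends on the installed codec version, an assumption here):
    
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
      auto_flush_interval => 5    # flush a pending event after 5 seconds of inactivity
    }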
    
    
    
    
    

  • Original source: https://www.cnblogs.com/hzcya1995/p/13349892.html