• Collecting Nginx Logs into HDFS with Flume


    Download apache-flume-1.7.0-bin.tar.gz and extract it with

    tar -zxvf apache-flume-1.7.0-bin.tar.gz

    then add the following to /etc/profile:

    export FLUME_HOME=/opt/apache-flume-1.7.0-bin
    export PATH=$PATH:$FLUME_HOME/bin
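    Put together, the install steps above can be sketched as a short shell session. The /opt install location comes from the text; adjust the paths to your own layout:

```shell
# Extract the tarball into /opt (install path assumed from the text above)
tar -zxvf apache-flume-1.7.0-bin.tar.gz -C /opt

# Append the environment variables to /etc/profile and reload it
cat >> /etc/profile <<'EOF'
export FLUME_HOME=/opt/apache-flume-1.7.0-bin
export PATH=$PATH:$FLUME_HOME/bin
EOF
. /etc/profile

# flume-ng should now resolve from PATH
command -v flume-ng
```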

    Next, edit two files under $FLUME_HOME/conf/. In flume-env.sh, set JAVA_HOME:

    JAVA_HOME=/opt/jdk1.8.0_121

    Most importantly, edit the flume-conf.properties file:

    # Name the agent's components
    a1.sources = r1
    a1.sinks = k1
    a1.channels = c1
    
    # Configure the source
    a1.sources.r1.type = exec
    a1.sources.r1.channels = c1
    a1.sources.r1.deserializer.outputCharset = UTF-8
    
    # The log file to monitor
    a1.sources.r1.command = tail -F /usr/local/nginx/log/access.log
    
    # Configure the sink
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.channel = c1
    a1.sinks.k1.hdfs.useLocalTimeStamp = true
    a1.sinks.k1.hdfs.path = hdfs://master:9000/flume/events/%Y-%m
    a1.sinks.k1.hdfs.filePrefix = %Y-%m-%d-%H
    a1.sinks.k1.hdfs.fileSuffix = .log
    a1.sinks.k1.hdfs.minBlockReplicas = 1
    a1.sinks.k1.hdfs.fileType = DataStream
    a1.sinks.k1.hdfs.writeFormat = Text
    a1.sinks.k1.hdfs.rollInterval = 86400
    a1.sinks.k1.hdfs.rollSize = 1000000
    a1.sinks.k1.hdfs.rollCount = 10000
    a1.sinks.k1.hdfs.idleTimeout = 0
    
    # Configure the channel
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000
    a1.channels.c1.transactionCapacity = 100
    
    # Wire the three together (sources take the plural "channels" key)
    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
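    One easy mistake in this file is binding the source with the singular `channel` key: Flume sources take the plural `channels` property, while sinks take the singular `channel`. A quick grep can confirm both bindings are present; this is only a sanity-check sketch, assuming the file lives at the path used below:

```shell
# Sources bind with the plural "channels" key; sinks with singular "channel".
# Both lines must match, or the agent starts with unwired components.
grep -cE '^a1\.(sources\.r1\.channels|sinks\.k1\.channel) ' \
    "$FLUME_HOME/conf/flume-conf.properties"
```

    A count of 2 means both the source and the sink are wired to a channel.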

    The file above defines the source, channel, and sink that ship records from the Nginx access log into HDFS. Start the agent with

    flume-ng agent -n a1 -c conf -f $FLUME_HOME/conf/flume-conf.properties

    If no errors are reported, the installation and configuration succeeded: every new record appended to the Nginx log will be collected by Flume and stored in HDFS.
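    To verify end to end, generate a request against Nginx and then list the sink's output directory in HDFS; the %Y-%m pattern in the configured path expands to the current year and month. This is a sketch assuming Nginx listens on localhost and the Hadoop client is configured to reach the master namenode:

```shell
# Produce a fresh access-log entry (assumes a local Nginx on port 80)
curl -s http://localhost/ > /dev/null
tail -n 1 /usr/local/nginx/log/access.log

# The sink writes under /flume/events/<year>-<month>; list this month's files
hdfs dfs -ls /flume/events/$(date +%Y-%m)
```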

  • Original post: https://www.cnblogs.com/mstk/p/6980212.html