• golang kafka client


    There are many open-source Kafka client packages for Go, such as sarama and confluent-kafka-go. While using the sarama package, we occasionally hit crashes under high concurrency, so we switched to confluent-kafka-go, which is simple to use and has proven stable in our experience.

    This article describes how to use confluent-kafka-go.
    confluent-kafka-go is the Golang package recommended on the Kafka website.

    confluent-kafka-go is Confluent's Golang client for Apache Kafka and the Confluent Platform.

    Setting up the build environment

    Installing librdkafka

    Download

    $ git clone https://github.com/edenhill/librdkafka.git
    $ cd librdkafka
    

    Configure, build, and install

    $ ./configure --prefix=/usr
    $ make
    $ sudo make install
    

    Configure PKG_CONFIG_PATH

    Add the following line to the end of ~/.bashrc:

    export PKG_CONFIG_PATH=/usr/lib/pkgconfig

    Download the Go client

    $ go get -u github.com/confluentinc/confluent-kafka-go/kafka
    

    This downloads the package into your GOPATH automatically; alternatively, you can fetch it from GitHub yourself and place it under GOPATH.
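
    Once both pieces are installed, a quick way to confirm that cgo can find librdkafka is to build and run a tiny program that only prints the librdkafka version the binding is linked against. This is a minimal verification sketch (the file name check_kafka.go is just an illustrative choice), not part of the original setup steps:

    // check_kafka.go - verifies that confluent-kafka-go builds and links against librdkafka.
    package main
    
    import (
    	"fmt"
    
    	"github.com/confluentinc/confluent-kafka-go/kafka"
    )
    
    func main() {
    	// LibraryVersion reports the librdkafka version as an integer and a readable string.
    	ver, verStr := kafka.LibraryVersion()
    	fmt.Printf("librdkafka version: %s (0x%x)\n", verStr, ver)
    }

    If this builds and prints a version, both librdkafka and the Go binding are in place.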

    Example

    // Example function-based Apache Kafka producer
    package main
    
    /**
     * Copyright 2016 Confluent Inc.
     *
     * Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     * http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    
    import (
    	"fmt"
    	"github.com/confluentinc/confluent-kafka-go/kafka"
    	"os"
    )
    
    func main() {
    
    	if len(os.Args) != 3 {
    		fmt.Fprintf(os.Stderr, "Usage: %s <broker> <topic>\n",
    			os.Args[0])
    		os.Exit(1)
    	}
    
    	broker := os.Args[1]
    	topic := os.Args[2]
    
    	p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": broker})
    
    	if err != nil {
    		fmt.Printf("Failed to create producer: %s\n", err)
    		os.Exit(1)
    	}
    
    	fmt.Printf("Created Producer %v\n", p)
    
    	// Optional delivery channel, if not specified the Producer object's
    	// .Events channel is used.
    	deliveryChan := make(chan kafka.Event)
    
    	value := "Hello Go!"
    	err = p.Produce(&kafka.Message{TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny}, Value: []byte(value)}, deliveryChan)
    	if err != nil {
    		fmt.Printf("Failed to enqueue message: %s\n", err)
    		os.Exit(1)
    	}
    
    	// Wait for the delivery report of the produced message.
    	e := <-deliveryChan
    	m := e.(*kafka.Message)
    
    	if m.TopicPartition.Error != nil {
    		fmt.Printf("Delivery failed: %v\n", m.TopicPartition.Error)
    	} else {
    		fmt.Printf("Delivered message to topic %s [%d] at offset %v\n",
    			*m.TopicPartition.Topic, m.TopicPartition.Partition, m.TopicPartition.Offset)
    	}
    
    	close(deliveryChan)
    }
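
    As the comment in the example notes, the delivery channel is optional. A rough sketch of the alternative (reusing the p and topic variables from the example above; not taken from the original post) is to read delivery reports from the producer's Events() channel and pass nil as the delivery channel:

    	// Sketch: handle delivery reports from p.Events() in a background goroutine.
    	go func() {
    		for e := range p.Events() {
    			switch ev := e.(type) {
    			case *kafka.Message:
    				if ev.TopicPartition.Error != nil {
    					fmt.Printf("Delivery failed: %v\n", ev.TopicPartition.Error)
    				} else {
    					fmt.Printf("Delivered message to %v\n", ev.TopicPartition)
    				}
    			}
    		}
    	}()
    
    	// With a nil delivery channel, the report for this message goes to p.Events().
    	p.Produce(&kafka.Message{
    		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
    		Value:          []byte("Hello Go!"),
    	}, nil)
    
    	// Flush waits up to the given number of milliseconds for outstanding delivery reports.
    	p.Flush(15 * 1000)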
    

    Note:
    If you need to link against the static library, you can delete the rdkafka shared library files (.so files) under /usr/lib/, and then pass the -tags static option to go build.
    For example:

    go build -tags static producer.go

    For more examples, see
    https://github.com/confluentinc/confluent-kafka-go/tree/master/examples
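
    The examples directory also contains consumer examples. A minimal consumer sketch, taking the same broker/topic command-line arguments as the producer above and using a hypothetical group id my-group (both assumptions, not from the original post), might look like this:

    // Minimal consumer sketch; group id "my-group" is an assumed, illustrative value.
    package main
    
    import (
    	"fmt"
    	"os"
    
    	"github.com/confluentinc/confluent-kafka-go/kafka"
    )
    
    func main() {
    	if len(os.Args) != 3 {
    		fmt.Fprintf(os.Stderr, "Usage: %s <broker> <topic>\n", os.Args[0])
    		os.Exit(1)
    	}
    
    	c, err := kafka.NewConsumer(&kafka.ConfigMap{
    		"bootstrap.servers": os.Args[1],
    		"group.id":          "my-group",
    		"auto.offset.reset": "earliest",
    	})
    	if err != nil {
    		fmt.Printf("Failed to create consumer: %s\n", err)
    		os.Exit(1)
    	}
    	defer c.Close()
    
    	if err := c.SubscribeTopics([]string{os.Args[2]}, nil); err != nil {
    		fmt.Printf("Failed to subscribe: %s\n", err)
    		os.Exit(1)
    	}
    
    	// Poll forever; stop with Ctrl-C.
    	for {
    		// ReadMessage blocks until a message arrives (-1 means no timeout).
    		msg, err := c.ReadMessage(-1)
    		if err != nil {
    			fmt.Printf("Consumer error: %v\n", err)
    			continue
    		}
    		fmt.Printf("Received %s: %s\n", msg.TopicPartition, string(msg.Value))
    	}
    }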

    References

    https://github.com/confluentinc/confluent-kafka-go
