https://community.hortonworks.com/questions/27187/using-kafkabolt-to-write-to-a-kafka-topic.html
--------------------------------------------------------------------------------------------------------------
I'm building a Kafka and Storm based streaming application with the following use case:
1. The application produces a JSON message to a Kafka topic
2. A Storm Kafka spout ingests the message and the topology processes it
3. A Storm bolt writes the output to another Kafka topic
Here are the components and versions I'm using:
Storm: 0.10.0
Kafka_2.10: 0.9.0.1
HDP: 2.3.4
storm-kafka: 0.10.0
I accomplished steps 1 and 2 using the "storm.kafka.KafkaSpout" that comes with storm-kafka.
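For context, the inbound side is wired up roughly like this; the ZooKeeper address, topic name, ids, and the ProcessBolt class name below are placeholders rather than my exact values:
import backtype.storm.spout.SchemeAsMultiScheme;
import backtype.storm.topology.TopologyBuilder;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;
import storm.kafka.StringScheme;
import storm.kafka.ZkHosts;

// Kafka spout reading the inbound JSON messages (placeholder ZooKeeper address and topic name)
ZkHosts zkHosts = new ZkHosts("192.168.56.102:2181");
SpoutConfig spoutConfig = new SpoutConfig(zkHosts, "INBOUND_TOPIC", "/kafka_spout", "inbound_spout_id");
spoutConfig.scheme = new SchemeAsMultiScheme(new StringScheme()); // the JSON arrives as a plain string

TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("kafka_inbound_spout", new KafkaSpout(spoutConfig), 1);
builder.setBolt("process_bolt", new ProcessBolt(), 3).shuffleGrouping("kafka_inbound_spout"); // my processing bolt (class name is a placeholder)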
I'm now trying to use "storm.kafka.KafkaBolt" (step 3) to write the processed data to the outbound topic, but I can't figure out exactly how to do it. Here is the code snippet:
Properties props = new Properties();
props.put("metadata.broker.list", "192.168.56.102:9092");
props.put("request.required.acks", "1");
props.put("serializer.class", "kafka.serializer.StringEncoder");
Config conf = new Config();
conf.put(KafkaBolt.KAFKA_BROKER_PROPERTIES, props);
KafkaBolt kafkaBolt = new KafkaBolt()
        .withTopicSelector(new DefaultTopicSelector("OUTBOUND_TOPIC"))
        .withTupleToKafkaMapper(new FieldNameBasedTupleToKafkaMapper());
builder.setBolt("kafka_outbound_bolt", kafkaBolt, 3).shuffleGrouping("process_bolt");
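The topology is then submitted in local mode, roughly like this (the topology name is just a placeholder):
import backtype.storm.LocalCluster;

// Local-mode submission; conf and builder are the objects built above
LocalCluster cluster = new LocalCluster();
cluster.submitTopology("kafka_streaming_topology", conf, builder.createTopology());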
"process_bolt" sends out a tuple that is a serializable domain model object (SubModel.java). I would like to send SubModel.toString() data to the outbound topic. I'm running storm in local cluster mode connected to kafka in a local VM.
- Am I doing anything wrong?
- How do I use "FieldNameBasedTupleToKafkaMapper" correctly here? Is my sketch above on the right track?