To integrate Kafka with Apache Storm in Java, follow these steps:
- Add the dependencies
First, make sure your project declares the Kafka and Storm dependencies. For a Maven project, add the following to the pom.xml file; storm-kafka-client provides the KafkaSpout and KafkaSpoutConfig classes used below:
```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.0</version>
</dependency>
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>2.3.2</version>
</dependency>
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka-client</artifactId>
    <!-- keep this version in line with storm-core -->
    <version>2.3.2</version>
</dependency>
```
- Create a Kafka producer
Create a Java class that sends messages to a Kafka topic. For example, create a file named ProducerExample.java (a top-level class cannot share the simple name KafkaProducer with the Kafka client class it imports):
```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class ProducerExample {
    public static void main(String[] args) {
        // Basic producer configuration: broker address and String serializers
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Send 100 messages to the "my-topic" topic, then close the producer
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>("my-topic", Integer.toString(i), Integer.toString(i * 2)));
        }
        producer.close();
    }
}
```
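Before wiring Storm in, it can help to confirm that the messages actually reach the topic. The following is a minimal sketch of a standalone consumer for that check, not part of the integration itself; the class name ConsumerCheck and the group id verify-group are illustrative:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "verify-group");          // illustrative consumer group id
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            // Poll a few times and print whatever arrives
            for (int i = 0; i < 10; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.key() + " -> " + record.value());
                }
            }
        }
    }
}
```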
- Create the Storm topology
Create a Java class that defines the Storm topology. For example, create a file named KafkaTopology.java (a dedicated name avoids clashing with the KafkaSpout class that storm-kafka-client provides):
```java
import org.apache.storm.Config;
import org.apache.storm.StormSubmitter;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class KafkaTopology {
    public static void main(String[] args) throws Exception {
        // Configure the Kafka spout provided by storm-kafka-client (see the next step)
        KafkaSpoutConfig<String, String> spoutConfig = KafkaSpoutConfig
                .builder("localhost:9092", "my-topic")
                .setProp("group.id", "storm-kafka-group")   // consumer group id; any unique name works
                .setProp("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
                .setProp("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
                .build();

        // Wire the spout and the bolt together
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig), 5);
        builder.setBolt("bolt", new KafkaBolt(), 5).shuffleGrouping("kafka-spout");

        // Submit the topology to the cluster
        Config config = new Config();
        config.setNumWorkers(3);
        StormSubmitter.submitTopology("kafka-topology", config, builder.createTopology());
    }
}
```
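While developing, it may be easier to run the topology in-process instead of submitting it to a cluster. The sketch below uses Storm's LocalCluster under the same assumptions as above (broker on localhost:9092, topic my-topic, the KafkaBolt defined in the last step); the class name LocalRunner and the consumer group id are illustrative:

```java
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class LocalRunner {
    public static void main(String[] args) throws Exception {
        // Same spout configuration as in KafkaTopology
        KafkaSpoutConfig<String, String> spoutConfig = KafkaSpoutConfig
                .builder("localhost:9092", "my-topic")
                .setProp("group.id", "storm-kafka-local")   // illustrative consumer group id
                .build();

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig), 1);
        builder.setBolt("bolt", new KafkaBolt(), 1).shuffleGrouping("kafka-spout");

        // Run everything inside the current JVM, then shut down after one minute
        try (LocalCluster cluster = new LocalCluster()) {
            cluster.submitTopology("kafka-topology-local", new Config(), builder.createTopology());
            Thread.sleep(60_000);
        }
    }
}
```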
- Configure the Kafka spout
You do not need to write a spout yourself: storm-kafka-client already ships org.apache.storm.kafka.spout.KafkaSpout, which reads messages from a Kafka topic. You only configure it through KafkaSpoutConfig, as KafkaTopology above already does:
```java
// Build the spout configuration: broker address, topic, consumer group and String deserializers
KafkaSpoutConfig<String, String> spoutConfig = KafkaSpoutConfig
        .builder("localhost:9092", "my-topic")
        .setProp("group.id", "storm-kafka-group")
        .setProp("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        .setProp("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        .build();

// Hand the configured spout to the topology
builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig), 5);
```
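Other Kafka consumer settings can be passed through setProp in the same way. A small sketch, assuming you prefer the ConsumerConfig constants from kafka-clients over raw property strings; the class name and the values shown are examples only:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;

public class SpoutConfigExample {
    // Build a spout configuration with a few extra consumer settings; the values are examples only
    static KafkaSpoutConfig<String, String> tunedSpoutConfig() {
        return KafkaSpoutConfig
                .builder("localhost:9092", "my-topic")
                .setProp(ConsumerConfig.GROUP_ID_CONFIG, "storm-kafka-group")
                .setProp(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 500)
                .setProp(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 30000)
                .build();
    }
}
```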
- Create the Kafka bolt
Create a Java class that processes the messages received from the Kafka spout. For example, create a file named KafkaBolt.java (this is your own bolt, not the KafkaBolt that storm-kafka-client provides for writing back to Kafka):
```java
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Tuple;

public class KafkaBolt extends BaseBasicBolt {

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        // The Kafka spout's default record translator emits the fields
        // "topic", "partition", "offset", "key" and "value"
        String message = input.getStringByField("value");
        System.out.println("Received message: " + message);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // This bolt only prints; it emits nothing downstream
    }
}
```
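If the bolt should feed further bolts instead of only printing, it has to emit tuples and declare their fields. A minimal sketch of such a variant; the class name ForwardingBolt, the upper-casing step and the field name processed are illustrative:

```java
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class ForwardingBolt extends BaseBasicBolt {

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        // Transform the Kafka message and pass it on to the next bolt
        String message = input.getStringByField("value");
        collector.emit(new Values(message.toUpperCase()));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // Downstream bolts subscribe to this single field
        declarer.declare(new Fields("processed"));
    }
}
```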
You have now integrated Kafka with Apache Storm. Run ProducerExample to send messages to the Kafka topic, then package the topology and submit KafkaTopology to your Storm cluster (for example with the storm jar command), or use the local-mode variant shown above. The Kafka spout reads the messages from the topic and passes them to KafkaBolt for processing.