To integrate Kafka with Spring for message persistence, follow these steps:
- Add dependencies
First, add the Spring Kafka and Kafka client dependencies to the project's pom.xml:
```xml
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.7.4</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.7.4</version>
</dependency>
```
- Configure Kafka
Configure the Kafka connection in application.yml (or the equivalent application.properties):
```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: my-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
```
- Create a Kafka configuration class
Create a configuration class that defines the producer and consumer factories, the KafkaTemplate, and the listener container factory:
```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaConfig {

    // Producer connection and serialization settings.
    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    // Consumer connection, group, and deserialization settings.
    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    // Template used by producers to send messages.
    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    // Container factory backing @KafkaListener methods.
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
```
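The later examples assume the my-topic topic already exists on the broker. If automatic topic creation is disabled, a NewTopic bean can declare it at startup (Spring Boot auto-configures the KafkaAdmin that applies it). This is a minimal sketch for a single local broker; the class name and the partition/replica counts are illustrative, not part of the original steps:

```java
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    // Declares "my-topic" at startup so the producer and listener have a topic to use.
    // One partition and one replica are enough for a local single-broker setup.
    @Bean
    public NewTopic myTopic() {
        return TopicBuilder.name("my-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}
```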
- Create a message producer
Create a Kafka producer service that sends messages to a Kafka topic:
```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // Sends a message to the given topic; the broker appends it to its log.
    public void sendMessage(String topic, String message) {
        kafkaTemplate.send(topic, message);
    }
}
```
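Note that send() is asynchronous and returns a ListenableFuture, so the call above does not by itself confirm that the broker has written the record. As a sketch (the CallbackKafkaProducer class, the sendMessageWithCallback method, and its log messages are illustrative, not part of the original example), a callback can report whether the record was acknowledged and where it landed:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;
import org.springframework.util.concurrent.ListenableFuture;

@Service
public class CallbackKafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    // Sends a message and logs the partition/offset once the broker acknowledges it.
    public void sendMessageWithCallback(String topic, String message) {
        ListenableFuture<SendResult<String, String>> future = kafkaTemplate.send(topic, message);
        future.addCallback(
                result -> System.out.println("Acknowledged at partition "
                        + result.getRecordMetadata().partition()
                        + ", offset " + result.getRecordMetadata().offset()),
                ex -> System.err.println("Send failed: " + ex.getMessage()));
    }
}
```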
- Create a message consumer
Create a Kafka consumer that receives messages from the Kafka topic:
```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumer {

    // Consumes messages from "my-topic" as part of the "my-group" consumer group.
    @KafkaListener(topics = "my-topic", groupId = "my-group")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}
```
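If the listener needs to know where a message sits in the persisted log, the method can accept the full ConsumerRecord instead of just the value. A minimal variation on the listener above (the class and method names here are illustrative):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaMetadataConsumer {

    // Receives the whole record, exposing the partition and offset it was stored at.
    @KafkaListener(topics = "my-topic", groupId = "my-group")
    public void listenWithMetadata(ConsumerRecord<String, String> record) {
        System.out.printf("Received '%s' from partition %d at offset %d%n",
                record.value(), record.partition(), record.offset());
    }
}
```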
- Send and receive messages
Finally, create a controller to exercise sending and receiving:
```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaController {

    @Autowired
    private KafkaProducer kafkaProducer;

    // Publishes a test message when /send is requested.
    @GetMapping("/send")
    public String sendMessage() {
        kafkaProducer.sendMessage("my-topic", "Hello, Kafka!");
        return "Message sent!";
    }
}
```
After starting the application, a request to the /send endpoint publishes a message to the my-topic topic. The Kafka consumer receives it and prints it to the console. Because Kafka persists every record to its on-disk log and retains it according to the topic's retention settings, the message is not lost even if the application restarts.
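Persistence itself is handled by the broker, but the producer can additionally be configured to wait for stronger acknowledgements before treating a send as successful. The following is only a sketch of a hardened variant of producerConfigs(); the class name and the specific values are assumptions to illustrate the idea, not settings from the original walkthrough:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class DurableProducerConfigs {

    // Producer settings that trade some latency for stronger delivery guarantees.
    public static Map<String, Object> durableProducerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for all in-sync replicas
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);   // avoid duplicates on retry
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE); // retry transient broker errors
        return props;
    }
}
```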