Spring Cloud Stream Kafka consumer/producer API exactly-once semantics (transactional)

Problem Description

Spring Cloud Stream for Kafka runs into a problem when transactions and dynamic destinations are both enabled. I have two different services.

If I simply run the services with @StreamListener and @SendTo, everything works as expected. The problem appears in the second service only when I start using dynamic destinations:
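For reference, the static-destination variant that works can be sketched roughly as follows (a minimal sketch assuming the standard Processor binding; the class name and enrich logic are placeholders, not the actual services):

```java
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

public class StaticDestinationProcessor {

    // Consume from the "input" binding and publish the return value to the
    // "output" binding; with a transaction-id-prefix configured, the binder
    // wraps the consume + produce in a single Kafka transaction.
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String process(String inMessage) {
        return enrich(inMessage);
    }

    // Placeholder transform, kept pure so it is easy to test in isolation.
    static String enrich(String payload) {
        return payload.trim();
    }
}
```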
Dynamic destination:

Caused by: java.lang.IllegalStateException: Cannot perform operation after producer has been closed
    at org.apache.kafka.clients.producer.KafkaProducer.throwIfProducerClosed(KafkaProducer.java:810) ~[kafka-clients-2.0.0.jar:na]
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:819) ~[kafka-clients-2.0.0.jar:na]
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803) ~[kafka-clients-2.0.0.jar:na]
    at org.springframework.kafka.core.DefaultKafkaProducerFactory$CloseSafeProducer.send(DefaultKafkaProducerFactory.java:423) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.kafka.core.KafkaTemplate.doSend(KafkaTemplate.java:351) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.kafka.core.KafkaTemplate.send(KafkaTemplate.java:209) ~[spring-kafka-2.2.0.RELEASE.jar:2.2.0.RELEASE]
    at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.handleRequestMessage(KafkaProducerMessageHandler.java:382) ~[spring-integration-kafka-3.1.0.RELEASE.jar:3.1.0.RELEASE]
    at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:123) [spring-integration-core-5.1.0.RELEASE.jar:5.1.0.RELEASE]
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:169) [spring-integration-core-5.1.0.RELEASE.jar:5.1.0.RELEASE]

I tried both approaches to dynamic destination resolution described in: dynamic destination resolver for spring cloud kafka.

yml:

spring:
  cloud.stream:
    bindings:
      input:
        destination: test_input
        content-type: application/json
        group: test_group
      output:
        destination: test_output
        content-type: application/json
    kafka.binder:
      configuration:
        isolation.level: read_committed
        security.protocol: SASL_SSL
        sasl.mechanism: GSSAPI
        sasl.kerberos.service.name: kafka
        ssl.truststore.location: jks
        ssl.truststore.password:
        ssl.endpoint.identification.algorithm: null
      brokers: broker1:9092,broker2:9092,broker3:9092
      auto-create-topics: false
      transaction:
        transaction-id-prefix: trans-2
        producer:
          configuration:
            retries: 2000
            acks: all
            security.protocol: SASL_SSL
            sasl.mechanism: GSSAPI
            sasl.kerberos.service.name: kafka
            ssl.truststore.location: jks
            ssl.truststore.password:
            ssl.endpoint.identification.algorithm: null
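The binder's transaction settings map onto standard kafka-clients configuration keys. As a hedged sketch (the property names are the standard Kafka client config keys; the transactional.id value shown is illustrative — the binder derives the real one from transaction-id-prefix plus an index), the effective client-side settings look roughly like:

```java
import java.util.Properties;

public class EffectiveKafkaProps {

    // Producer side: setting transactional.id enables transactions,
    // which in turn require idempotence and acks=all.
    static Properties producerProps() {
        Properties p = new Properties();
        p.put("transactional.id", "trans-2-0"); // illustrative; derived from transaction-id-prefix
        p.put("acks", "all");                   // required for transactional producers
        p.put("retries", "2000");
        return p;
    }

    // Consumer side: read_committed hides records from aborted transactions.
    static Properties consumerProps() {
        Properties p = new Properties();
        p.put("isolation.level", "read_committed");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(producerProps());
        System.out.println(consumerProps());
    }
}
```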

Here is the background for this question:
Spring Cloud Stream for Kafka with consumer/producer API exactly-once semantics with transaction-id-prefix is not working as expected

Updated code:


    @Autowired
    private BinderAwareChannelResolver resolver;

    @StreamListener(target = Processor.INPUT)
    public void consumer(@Payload Object inMessage, @Headers Map<String, Object> headers) {
        String topicName = null;
        String itemType = null;
        try {
            TransactionSynchronizationManager.setActualTransactionActive(true);
            itemType = msgService.itemTypeExtract((String) inMessage);
            topicName = msgService.getTopicName(itemType, (String) inMessage);

            Map<String, Object> headersMap = new HashMap<>();
            headersMap.put(MessageHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE);

            // send to the dynamically resolved destination
            resolver.resolveDestination(topicName)
                    .send(MessageBuilder.createMessage(inMessage, new MessageHeaders(headersMap)), 10000);
        } catch (Exception e) {
            // note: catching here swallows the failure, so the transaction still commits
            LOGGER.error("error " + e.getMessage(), e);
        }
    }
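The msgService used above is not shown in the question. A hypothetical sketch of such a routing helper (the method names itemTypeExtract/getTopicName come from the call sites above; the JSON handling and topic mapping are assumptions for illustration only) could look like:

```java
import java.util.Map;

// Hypothetical stand-in for the msgService used above: extracts an item
// type from the payload and maps it to a destination topic name.
public class MsgRoutingService {

    private static final Map<String, String> TOPIC_BY_TYPE = Map.of(
            "order", "orders_topic",
            "invoice", "invoices_topic");

    // Naive extraction: assumes the payload contains "itemType":"<value>".
    public String itemTypeExtract(String inMessage) {
        String marker = "\"itemType\":\"";
        int start = inMessage.indexOf(marker);
        if (start < 0) {
            return "unknown";
        }
        start += marker.length();
        int end = inMessage.indexOf('"', start);
        return end < 0 ? "unknown" : inMessage.substring(start, end);
    }

    // Falls back to a dead-letter topic for unrecognized item types.
    public String getTopicName(String itemType, String inMessage) {
        return TOPIC_BY_TYPE.getOrDefault(itemType, "dead_letter_topic");
    }
}
```

In a real service this would use a proper JSON parser rather than string scanning; the point is only that the listener resolves the topic per message before handing it to the BinderAwareChannelResolver.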

Tags: kafka-consumer-api, kafka-producer-api, spring-cloud-stream

Solution


There is a bug in the binder; I have opened an issue to fix it.

