Intermediate topics in Kafka with Spring request-reply semantics

Problem Description

With the latest version of spring-kafka we are trying to use request-reply semantics, and we would like to know whether we can use intermediate topics without losing the correlation ID. One of our use cases is to receive a message from an API, produce it to topic1, send the result to topic2, process the message on topic2 and send it to topic3, from which the final response is sent back to the initial request that came in via topic1.

I am not able to correlate the response from topic3 with the request on topic1, because the correlation ID is lost in the intermediate topic. If I don't use an intermediate topic (i.e. skip topic2), it works: topic1 sends a message with a correlation ID and I receive the corresponding response from topic3.

Any suggestions would be greatly appreciated.

Below is the sample code. From my API I post a transaction:

public String postTransaction(String request, Map<String, String> headers) throws InterruptedException, ExecutionException {
    ProducerRecord<String, String> record = new ProducerRecord<>(topic1, "300", request);
    // Tell the replying side where to send the response.
    record.headers().add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, topic3.getBytes()));
    RequestReplyFuture<String, String, String> sendAndReceive = kafkaTemplate.sendAndReceive(record);
    // Wait for the send to complete, then block for the correlated reply.
    SendResult<String, String> requestMessage = sendAndReceive.getSendFuture().get();
    return sendAndReceive.get().value();
}

In another consumer I listen on topic1, grab the correlation ID, and produce a message to topic2; the listener on topic2 then sends the reply to topic3.

public void listen(String request, @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) List<Integer> keys,
                   @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
                   @Header(KafkaHeaders.RECEIVED_TOPIC) List<String> topics,
                   @Header(KafkaHeaders.OFFSET) List<Long> offsets,
                   @Header(KafkaHeaders.CORRELATION_ID) byte[] coRlId) throws InterruptedException {

    // Forward the payload to topic2, carrying the correlation ID header along.
    ProducerRecord<String, String> record = new ProducerRecord<>("topic2", "300", request);
    record.headers().add(new RecordHeader(KafkaHeaders.CORRELATION_ID, coRlId));

    kafkaTemplate.send(record);

}

Tags: apache-kafka, spring-kafka

Solution


I just tested it and it works fine for me...

@SpringBootApplication
public class So61152047Application {

    public static void main(String[] args) {
        SpringApplication.run(So61152047Application.class, args);
    }

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Bean
    public ReplyingKafkaTemplate<String, String, String> replyer(ProducerFactory<String, String> pf,
            ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {

        containerFactory.setReplyTemplate(kafkaTemplate(pf));
        ConcurrentMessageListenerContainer<String, String> container = replyContainer(containerFactory);
        ReplyingKafkaTemplate<String, String, String> replyer = new ReplyingKafkaTemplate<>(pf, container);
        return replyer;
    }

    @Bean
    public ConcurrentMessageListenerContainer<String, String> replyContainer(
            ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {

        ConcurrentMessageListenerContainer<String, String> container = containerFactory.createContainer("topic3");
        container.getContainerProperties().setGroupId("three");
        return container;
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> pf) {
        return new KafkaTemplate<>(pf);
    }


    @Bean
    public ApplicationRunner runner(ReplyingKafkaTemplate<String, String, String> rkt) {
        return args -> {
            ProducerRecord<String, String> pr = new ProducerRecord<>("topic1", "foo", "bar");
            RequestReplyFuture<String, String, String> future = rkt.sendAndReceive(pr);
            System.out.println(future.get(10, TimeUnit.SECONDS).value());
        };
    }

    @Bean
    public NewTopic topic1() {
        return TopicBuilder.name("topic1").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic2() {
        return TopicBuilder.name("topic2").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic3() {
        return TopicBuilder.name("topic3").partitions(1).replicas(1).build();
    }

    @KafkaListener(id = "one", topics = "topic1")
    public void listen1(String in,
            @Header(KafkaHeaders.CORRELATION_ID) byte[] corrId) {

        System.out.println(in);
        ProducerRecord<String, String> pr = new ProducerRecord<>("topic2", in.toUpperCase());
        pr.headers().add(new RecordHeader(KafkaHeaders.CORRELATION_ID, corrId));
        this.kafkaTemplate.send(pr);
    }

    @KafkaListener(id = "two", topics = "topic2")
    @SendTo("topic3")
    public String listen2(String in) {
        return in + in;
    }

}

bar
BARBAR
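
For completeness, the postTransaction method from the question could be wired to the replyer bean in this configuration rather than a plain KafkaTemplate; the ReplyingKafkaTemplate adds the KafkaHeaders.REPLY_TOPIC and KafkaHeaders.CORRELATION_ID headers itself, so they do not need to be set by hand. A minimal sketch, assuming an injected replyingTemplate field and keeping the fixed key "300" from the question:

    @Autowired
    private ReplyingKafkaTemplate<String, String, String> replyingTemplate; // the replyer bean above

    public String postTransaction(String request) throws Exception {
        // No need to add REPLY_TOPIC/CORRELATION_ID headers manually; the template
        // sets them based on the reply container listening on topic3.
        ProducerRecord<String, String> record = new ProducerRecord<>("topic1", "300", request);
        RequestReplyFuture<String, String, String> future = replyingTemplate.sendAndReceive(record);
        // Block (with a timeout) until the correlated reply arrives from topic3.
        return future.get(10, TimeUnit.SECONDS).value();
    }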

You can also propagate the reply topic header; @SendTo then needs no explicit value, since it falls back to the KafkaHeaders.REPLY_TOPIC header on the incoming record...

    @KafkaListener(id = "one", topics = "topic1")
    public void listen1(String in,
            @Header(KafkaHeaders.REPLY_TOPIC) byte[] replyTo,
            @Header(KafkaHeaders.CORRELATION_ID) byte[] corrId) {

        System.out.println(in);
        ProducerRecord<String, String> pr = new ProducerRecord<>("topic2", in.toUpperCase());
        pr.headers().add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, replyTo));
        pr.headers().add(new RecordHeader(KafkaHeaders.CORRELATION_ID, corrId));
        this.kafkaTemplate.send(pr);
    }

    @KafkaListener(id = "two", topics = "topic2")
    @SendTo // ("topic3")
    public String listen2(String in) {
        return in + in;
    }

EDIT

To convey the correlation ID via the payload instead:

public class CorrelatingProducerInterceptor implements ProducerInterceptor<String, Foo> {

    @Override
    public void configure(Map<String, ?> configs) {
    }

    @Override
    public ProducerRecord<String, Foo> onSend(ProducerRecord<String, Foo> record) {
        // Copy the correlation id header (set by the ReplyingKafkaTemplate) into the payload.
        Header correlation = record.headers().lastHeader(KafkaHeaders.CORRELATION_ID);
        if (correlation != null) {
            record.value().setCorrelation(correlation.value());
        }
        return record;
    }

    @Override
    public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
    }

    @Override
    public void close() {
    }

}

@SpringBootApplication
public class So61152047Application {

    public static void main(String[] args) {
        SpringApplication.run(So61152047Application.class, args);
    }

    @Autowired
    private KafkaTemplate<String, Foo> kafkaTemplate;

    @Bean
    public ReplyingKafkaTemplate<String, Foo, Foo> replyer(ProducerFactory<String, Foo> pf,
            ConcurrentKafkaListenerContainerFactory<String, Foo> containerFactory) {

        containerFactory.setReplyTemplate(kafkaTemplate(pf));
        ConcurrentMessageListenerContainer<String, Foo> container = replyContainer(containerFactory);
        ReplyingKafkaTemplate<String, Foo, Foo> replyer = new ReplyingKafkaTemplate<>(pf, container);
        return replyer;
    }

    @Bean
    public ConcurrentMessageListenerContainer<String, Foo> replyContainer(
            ConcurrentKafkaListenerContainerFactory<String, Foo> containerFactory) {

        ConcurrentMessageListenerContainer<String, Foo> container = containerFactory.createContainer("topic3");
        container.getContainerProperties().setGroupId("three");
        return container;
    }

    @Bean
    public KafkaTemplate<String, Foo> kafkaTemplate(ProducerFactory<String, Foo> pf) {
        return new KafkaTemplate<>(pf);
    }

    @Bean
    public ApplicationRunner runner(ReplyingKafkaTemplate<String, Foo, Foo> rkt) {
        return args -> {
            ProducerRecord<String, Foo> pr = new ProducerRecord<>("topic1", "foo", new Foo("bar"));
            RequestReplyFuture<String, Foo, Foo> future = rkt.sendAndReceive(pr);
            System.out.println(future.get(10, TimeUnit.SECONDS).value());
        };
    }

    @Bean
    public NewTopic topic1() {
        return TopicBuilder.name("topic1").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic2() {
        return TopicBuilder.name("topic2").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic3() {
        return TopicBuilder.name("topic3").partitions(1).replicas(1).build();
    }

    @KafkaListener(id = "one", topics = "topic1")
    public void listen1(Foo in) {

        System.out.println(in);
        in.setContent(in.getContent().toUpperCase());
        ProducerRecord<String, Foo> pr = new ProducerRecord<>("topic2", in);
        this.kafkaTemplate.send(pr);
    }

    @KafkaListener(id = "two", topics = "topic2")
    public void listen2(Foo in) {
        ProducerRecord<String, Foo> pr = new ProducerRecord<>("topic3", new Foo(in.getContent() + in.getContent()));
        // Restore the correlation carried in the payload as a header so the
        // ReplyingKafkaTemplate's reply container can match the reply to the request.
        pr.headers().add(new RecordHeader(KafkaHeaders.CORRELATION_ID, in.getCorrelation()));
        this.kafkaTemplate.send(pr);
    }

}

class Foo {

    String content;

    byte[] correlation;

    public Foo() {
    }

    public Foo(String content) {
        this.content = content;
    }

    public String getContent() {
        return this.content;
    }

    public void setContent(String content) {
        this.content = content;
    }

    public byte[] getCorrelation() {
        return this.correlation;
    }

    public void setCorrelation(byte[] correlation) {
        this.correlation = correlation;
    }

    @Override
    public String toString() {
        return "Foo [content=" + this.content + "]";
    }

}

The application.properties for this example:

spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.default.type=com.example.demo.Foo

spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.producer.properties.interceptor.classes=com.example.demo.CorrelatingProducerInterceptor

Foo [content=bar]
Foo [content=BARBAR]

Of course, the intermediate application(s) need to pass the correlation id along, even though it is now carried in the payload.
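
For example, if there were an extra hop between topic2 and topic3, that listener would also have to copy the correlation field into its outgoing payload. A hypothetical sketch (the listener id "relay" and the topic names intermediate-in / intermediate-out are invented for illustration):

    @KafkaListener(id = "relay", topics = "intermediate-in") // hypothetical extra hop
    public void relay(Foo in) {
        Foo out = new Foo(in.getContent());
        // Propagate the correlation carried in the payload so the final hop can
        // restore it as a KafkaHeaders.CORRELATION_ID header before replying on topic3.
        out.setCorrelation(in.getCorrelation());
        this.kafkaTemplate.send(new ProducerRecord<>("intermediate-out", out));
    }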

