design-patterns - Consumer Producer design
Problem description
I am looking for advice on how the following problem is handled in solutions that use Pulsar/Kafka.
The scenario: a producer sends messages (in JSON) and a consumer takes them and inserts the data into database tables (specified in the message).
Suddenly the producer changes the structure of the data in the messages (say, because a table in the database gained a new column).
So for a moment the queue holds messages with the old structure while messages with the new structure start arriving.
My question is how the consumer should handle this scenario. What should be done with the old-structure messages that are now invalid, since they can no longer be inserted into the table after its structure changed? Retry and then permanently fail (dead-letter queue?).
Also, do you usually send the schema metadata along with your messages, or do you normally handle it in a separate topic or some other way?
Thanks for any advice.
Solution
The problem described mostly concerns the external system, and preventing it would require some kind of gatekeeping/pre-validation on the producer side, with knowledge of how the data is consumed. Unfortunately that introduces tight coupling, so without it you must explicitly write the consumer code for robust message transformation and exception handling, probably including some kind of version number or an explicit schema per message, such as the Confluent Schema Registry provides (and possibly Pulsar's Schema Registry feature as well).
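As a minimal sketch of the consumer-side approach described above: assume each message carries a "version" field, the new table column is called "status", and a dead-letter list stands in for a real dead-letter topic. All field names and the upgrade rule here are assumptions for illustration, not part of the original question.

```python
import json

DEAD_LETTERS = []  # stand-in for a dead-letter topic


def upgrade_v1(payload):
    """Transform an old-structure (v1) payload to the current (v2) shape
    by filling a default for the hypothetical new "status" column."""
    payload = dict(payload)
    payload.setdefault("status", "unknown")
    return payload


def handle_message(raw):
    """Parse, dispatch on version, and return a row ready for insertion.
    Malformed or unknown-version messages go to the dead-letter store."""
    try:
        msg = json.loads(raw)
        version = msg.get("version", 1)  # default for producers that predate versioning
        payload = msg["payload"]
        if version == 1:
            payload = upgrade_v1(payload)
        elif version != 2:
            raise ValueError(f"unsupported version {version}")
        return payload  # in real code: insert into the target table here
    except (KeyError, ValueError) as exc:  # json.JSONDecodeError is a ValueError
        DEAD_LETTERS.append((raw, str(exc)))
        return None
```

With a schema registry in place, the version dispatch and upgrade functions are replaced by the registry's compatibility rules, but the consumer-side idea is the same: every message must be mappable to the current table structure or explicitly rejected.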