apache-kafka - How to make Kafka Connect pick up messages from the last failure
Problem description
I am using the Elasticsearch Kafka Connect sink in standalone mode. I am confused about which configuration to use so that, on restart, Kafka Connect picks up from the point of the last failure.
For example, the producer keeps pushing records into Kafka while the Elasticsearch sink connector consumes them. Now suppose my consumer goes down for some reason, but the producer keeps pushing messages into Kafka. Once I have fixed the issue on the ES sink connector side and restart it, it should pick up from the last failure point, not from the beginning and not from the latest offset. If 10 messages arrived since the failure, then when the ES sink connector starts it should first sink those 10 and then the latest ones.
Please help with the configuration.
Solution
Kafka Connect acts like any other Kafka consumer. When you restart it, the consumer resumes from the last committed offset for its group. This is the default behavior, and there is no setting in Connect to change it.
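To illustrate, a minimal standalone worker configuration might look like the sketch below (the file name and values are assumptions, not taken from the question). Note that `auto.offset.reset` is only consulted when the consumer group has no committed offset at all, for example on the very first start; after a failure and restart, the committed offsets win, which is exactly the resume-from-last-failure behavior being asked for:

```properties
# worker.properties - illustrative standalone Connect worker config
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# Standalone mode stores *source* connector offsets in this file;
# sink connectors such as the ES sink use ordinary consumer-group
# offsets committed to the __consumer_offsets topic instead.
offset.storage.file.filename=/tmp/connect.offsets

# Worker-level consumer override (the "consumer." prefix is passed
# through to the underlying consumer). Only applies when the group
# has no committed offset yet - NOT after a restart from failure.
consumer.auto.offset.reset=earliest
```

So no special configuration is needed for the scenario described: after the connector is fixed and restarted, it will consume the 10 backlogged messages first and then continue with new ones.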