Message conversion in SCDF on Kafka and NonTrustedHeaders

Problem description

I am having a hard time figuring out how to get a simple SCDF pipeline to work.

I am using a local setup:

{
  "versionInfo": {
    "implementation": {
      "name": "spring-cloud-dataflow-server-local",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "core": {
      "name": "Spring Cloud Data Flow Core",
      "version": "1.6.0.BUILD-SNAPSHOT"
    },
    "dashboard": {
      "name": "Spring Cloud Dataflow UI",
      "version": "1.6.0.M1"
    },
    "shell": {
      "name": "Spring Cloud Data Flow Shell",
      "version": "1.6.0.BUILD-SNAPSHOT",
      "url": "https://repo.spring.io/libs-snapshot/org/springframework/cloud/spring-cloud-dataflow-shell/1.6.0.BUILD-SNAPSHOT/spring-cloud-dataflow-shell-1.6.0.BUILD-SNAPSHOT.jar"
    }
  },
  "featureInfo": {
    "analyticsEnabled": true,
    "streamsEnabled": true,
    "tasksEnabled": true,
    "skipperEnabled": false
  },
  "securityInfo": {
    "isAuthenticationEnabled": false,
    "isAuthorizationEnabled": false,
    "isFormLogin": false,
    "isAuthenticated": false,
    "username": null,
    "roles": []
  },
  "runtimeEnvironment": {
    "appDeployer": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalAppDeployer",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    },
    "taskLauncher": {
      "platformSpecificInfo": {},
      "deployerImplementationVersion": "1.3.7.RELEASE",
      "deployerName": "LocalTaskLauncher",
      "deployerSpiVersion": "1.3.2.RELEASE",
      "javaVersion": "1.8.0_45",
      "platformApiVersion": "Mac OS X 10.13.4",
      "platformClientVersion": "10.13.4",
      "platformHostVersion": "10.13.4",
      "platformType": "Local",
      "springBootVersion": "1.5.14.RELEASE",
      "springVersion": "4.3.18.RELEASE"
    }
  }
}

The pipeline is very simple:

http --port=9191 | transform --expression=payload.toUpperCase() | log

When I trigger the http endpoint with cURL like this:

curl -v -H"Referer: http://localhost:8080" -H"Content-Type: text/plain" -XPOST localhost:9191/ -d 'test'

I see the following error message in the log file of the transform processor:

2018-07-11 09:56:59.758 ERROR 66396 --- [container-0-C-1] o.s.kafka.listener.LoggingErrorHandler   : Error while processing: ConsumerRecord(topic = edded.http, partition = 0, offset = 0, CreateTime = 1531295816669, serialized key size = -1, serialized value size = 17, headers = RecordHeaders(headers = [RecordHeader(key = referer, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 56, 48, 56, 48, 34]), RecordHeader(key = content-length, value = [49, 55]), RecordHeader(key = http_requestMethod, value = [34, 80, 79, 83, 84, 34]), RecordHeader(key = host, value = [34, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 34]), RecordHeader(key = http_requestUrl, value = [34, 104, 116, 116, 112, 58, 47, 47, 108, 111, 99, 97, 108, 104, 111, 115, 116, 58, 57, 49, 57, 49, 47, 34]), RecordHeader(key = contentType, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 116, 101, 120, 116, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 112, 108, 97, 105, 110, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 34, 85, 84, 70, 45, 56, 34, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 102, 97, 108, 115, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 116, 114, 117, 101, 125]), RecordHeader(key = user-agent, value = [34, 77, 111, 122, 105, 108, 108, 97, 47, 53, 46, 48, 32, 40, 99, 111, 109, 112, 97, 116, 105, 98, 108, 101, 59, 32, 77, 83, 73, 69, 32, 57, 46, 48, 59, 32, 87, 105, 110, 100, 111, 119, 115, 32, 78, 84, 32, 54, 46, 49, 59, 32, 84, 114, 105, 100, 101, 110, 116, 47, 53, 46, 48, 41, 34]), RecordHeader(key = accept, value = [123, 34, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 115, 117, 98, 116, 121, 112, 101, 34, 58, 34, 42, 34, 44, 34, 112, 97, 114, 97, 109, 101, 116, 101, 114, 115, 34, 58, 123, 125, 44, 34, 113, 117, 97, 108, 105, 116, 121, 86, 97, 108, 117, 101, 34, 58, 49, 46, 48, 44, 34, 99, 104, 97, 114, 115, 101, 116, 34, 58, 110, 117, 108, 108, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 84, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 119, 105, 108, 100, 99, 97, 114, 100, 83, 117, 98, 116, 121, 112, 101, 34, 58, 116, 114, 117, 101, 44, 34, 99, 111, 110, 99, 114, 101, 116, 101, 34, 58, 102, 97, 108, 115, 101, 125]), RecordHeader(key = spring_json_header_types, value = [123, 34, 114, 101, 102, 101, 114, 101, 114, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 45, 108, 101, 110, 103, 116, 104, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 76, 111, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 77, 101, 116, 104, 111, 100, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 111, 115, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 104, 116, 116, 112, 95, 114, 101, 113, 117, 101, 115, 116, 85, 114, 108, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 99, 111, 110, 116, 101, 110, 116, 84, 121, 112, 101, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 
114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 44, 34, 117, 115, 101, 114, 45, 97, 103, 101, 110, 116, 34, 58, 34, 106, 97, 118, 97, 46, 108, 97, 110, 103, 46, 83, 116, 114, 105, 110, 103, 34, 44, 34, 97, 99, 99, 101, 112, 116, 34, 58, 34, 111, 114, 103, 46, 115, 112, 114, 105, 110, 103, 102, 114, 97, 109, 101, 119, 111, 114, 107, 46, 104, 116, 116, 112, 46, 77, 101, 100, 105, 97, 84, 121, 112, 101, 34, 125])], isReadOnly = false), key = null, value = [B@4bc28689)

org.springframework.messaging.MessageHandlingException: nested exception is org.springframework.expression.spel.SpelEvaluationException: EL1004E: Method call: Method toUpperCase() cannot be found on type byte[]
    at org.springframework.integration.handler.MethodInvokingMessageProcessor.processMessage(MethodInvokingMessageProcessor.java:107) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.handler.ServiceActivatingHandler.handleRequestMessage(ServiceActivatingHandler.java:93) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:109) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:158) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:132) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:105) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:73) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:445) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:394) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:181) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:160) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:47) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:108) ~[spring-messaging-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:203) ~[spring-integration-core-5.0.6.RELEASE.jar!/:5.0.6.RELEASE]
    at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$300(KafkaMessageDrivenChannelAdapter.java:70) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
    at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:387) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
    at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:364) ~[spring-integration-kafka-3.0.3.RELEASE.jar!/:3.0.3.RELEASE]
    at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter$$Lambda$659/1406308390.doWithRetry(Unknown Source) ~[na:na]
    at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
    at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211) ~[spring-retry-1.2.2.RELEASE.jar!/:na]
    at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40) ~[spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1071) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1051) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:998) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:866) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:724) [spring-kafka-2.1.7.RELEASE.jar!/:2.1.7.RELEASE]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_45]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_45]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]

Since I supply a Content-Type header with the HTTP request, and after reading this blog post, I assumed that during message conversion the payload of the message (Kafka's default wire format being byte[], as I understand it) would then be converted to a String representation. However, the type of the Message.payload received in TransformProcessorConfiguration.transform is still byte[].

Could this behavior be related to the fact that the Content-Type header shows up as a NonTrustedHeaderType in the call to MessagingMessageConverter.toMessage()? Stepping through with a debugger shows the following for the contentType header:

headerValue = {"type":"text","subtype":"plain","parameters":{"charset":"UTF-8"},"qualityValue":1.0,"charset":"UTF-8","wildcardType":false,"wildcardSubtype":false,"concrete":true}
untrustedType = "org.springframework.http.MediaType"

This is the list of rawHeaders that the MessagingMessageConverter resolves:

"referer"->"http://localhost:8080"
"content-length"->"17"
"http_requestMethod"->"POST"
"kafka_timestampType"->"CREATE_TIME"
"kafka_receivedMessageKey"->"null"
"kafka_receivedTopic"->"edded.http"
"accept"->"NonTrustedHeaderType
"kafka_offset"->"1"
"scst_nativeHeadersPresent"->"true"
"kafka_consumer"->
"host"->"localhost:9191"
"http_requestUrl"->"http://localhost:9191/"
"kafka_receivedPartitionId"->"0"
"contentType"->"NonTrustedHeaderType
"kafka_receivedTimestamp"->"1531296520235"
"user-agent"->"Mozilla/5.0

Another, possibly related, issue I found is described here. However, I don't know how to control the mapper's trustedPackages via binder properties, if that is relevant to my problem at all.
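For illustration, relaxing that trusted-packages restriction might look like the sketch below, assuming a DefaultKafkaHeaderMapper bean that the Kafka binder actually picks up (whether it does, and under which bean name, depends on the binder version):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.DefaultKafkaHeaderMapper;

@Configuration
public class HeaderMapperConfig {

    // Sketch: additionally trust org.springframework.http so that the MediaType
    // carried in the contentType header is deserialized instead of being mapped
    // to a NonTrustedHeaderType. Bean name and binder wiring are assumptions.
    @Bean
    public DefaultKafkaHeaderMapper kafkaBinderHeaderMapper() {
        DefaultKafkaHeaderMapper mapper = new DefaultKafkaHeaderMapper();
        mapper.addTrustedPackages("org.springframework.http");
        return mapper;
    }
}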

I also tried setting app.*.spring.cloud.stream.bindings.input.producer.headerMode=raw in the deployment properties, to no effect.

Thanks!

Tags: spring, spring-cloud-stream, spring-kafka, spring-cloud-dataflow

Solution


Actually, the blog post you are pointing to should not lead to the assumption that conversion is performed based on the content-type header. Conversion is performed based only on the type required by the handler; if that type is generic (i.e. Object) or byte[], no conversion is performed. What is the signature of TransformProcessorConfiguration.transform(..)? Also, if you attempt any kind of SpEL evaluation against the payload, you must assume it is always a byte[], because conversion only happens when the handler method is about to be invoked. So if you have, say, a condition against the payload and assume String, don't.
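To make the answer's point concrete: it is the declared parameter type of the handler, not the Content-Type header, that drives conversion. A minimal custom-processor sketch (hypothetical, not the actual TransformProcessorConfiguration) could look like this:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

@EnableBinding(Processor.class)
public class UppercaseProcessor {

    // Declaring the parameter as String is what triggers conversion of the
    // incoming byte[] payload; declaring it as byte[] or Object leaves the
    // payload untouched.
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String transform(String payload) {
        return payload.toUpperCase();
    }
}

For the out-of-the-box transform processor, where the SpEL expression sees the raw byte[], a commonly used workaround is to convert inside the expression itself, e.g. --expression="new String(payload).toUpperCase()".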

