MQTTUtils.createPairedStream() is not a member of org.apache.bahir

Problem description

When I start spark-shell with the following command

bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0 --repositories http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/

two errors occur:

Server access error at url https://central.maven.org/org/apache/bahir/bahir-parent_2.11/2.3.2/bahir-parent_2.11-2.3.2.jar (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching central.maven.org found.)

Server access error at url https://central.maven.org/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.2/spark-streaming-mqtt_2.11-2.3.2-javadoc.jar (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching central.maven.org found.)

Here I passed the repository as http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/, but it automatically connects to https://central.maven.org/org/apache/bahir/bahir-parent_2.11/2.3.2/bahir-parent_2.11-2.3.2.jar, which does not exist on the internet.

How can I add these two modules to my spark-shell? My goal is to build a Spark Streaming MQTT application that handles multiple topics.

Tags: scala, spark-streaming, mqtt, iot, apache-bahir

Solution


This looks like an issue on your system. There are many possible causes of javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException. One of them is that the request fails because the requested host URL (including the IP address) does not match the certificate, which usually contains the DNS hostname. This happens when the certificate is missing an alias for the host, i.e. a Subject Alternative Name, and the server is reached under a name different from its default name.
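To confirm that this is what is happening, you can check which Subject Alternative Names the server actually presents. The snippet below is only a diagnostic sketch: it uses a plain SSLSocket (which, unlike HttpsURLConnection, does not verify the hostname during the handshake) and assumes the certificate chain itself is trusted; the host name and port are simply taken from the error above. If central.maven.org does not appear in the printed entries, the CertificateException is the expected outcome.

    import java.security.cert.X509Certificate
    import javax.net.ssl.{SSLContext, SSLSocket}
    import scala.collection.JavaConverters._

    // Open a raw TLS socket; a plain SSLSocket does not perform hostname
    // verification, so the handshake can succeed even when the name mismatches.
    val socket = SSLContext.getDefault.getSocketFactory
      .createSocket("central.maven.org", 443)
      .asInstanceOf[SSLSocket]
    socket.startHandshake()

    // Print every Subject Alternative Name entry the server's certificate carries;
    // the hostname you requested must be listed here for verification to pass.
    val cert = socket.getSession.getPeerCertificates.head.asInstanceOf[X509Certificate]
    Option(cert.getSubjectAlternativeNames).foreach(_.asScala.foreach(println))
    socket.close()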

The issue can be resolved in several ways. Please find some alternatives at the following links:

https://support.mulesoft.com/s/article/CertificateException-No-Subject-Alternative-Names-Present

https://support.cloudbees.com/hc/en-us/articles/360017693231-Why-am-I-getting-No-subject-alternative-DNS-name-matching-XXX-when-connecting-through-ldaps-

https://confluence.atlassian.com/confkb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-452100730.html

https://confluence.atlassian.com/jirakb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-297669411.html

I was able to add the modules to the spark-shell. Please find the snippet below.

    C:\Users\XYzUser>spark-shell --repositories http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0
http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ added as a remote repository with the name: repo-1
Ivy Default Cache set to: C:\Users\..\.ivy2\cache
The jars for the packages stored in: C:\Users\..\.ivy2\jars
:: loading settings :: url = jar:file:/C:/Tools/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.bahir#spark-streaming-mqtt_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb;1.0
        confs: [default]
        found org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 in central
        found org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 in central
        found org.spark-project.spark#unused;1.0.0 in user-list
:: resolution report :: resolve 7200ms :: artifacts dl 16ms
        :: modules in use:
        org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 from central in [default]
        org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 from central in [default]
        org.spark-project.spark#unused;1.0.0 from user-list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   3   |   1   |   1   |   0   ||   3   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb
        confs: [default]
        0 artifacts copied, 3 already retrieved (0kB/31ms)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://...:4040
Spark context available as 'sc' (master = local[*], app id = local-1552454258705).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/
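With the packages resolved, the MQTT classes are available inside the shell and you can wire up several topics. The snippet below is only a minimal sketch of one way to do it, creating one receiver-backed stream per topic with MQTTUtils.createStream and unioning them; the broker URL, topic names and batch interval are placeholders, and MQTTUtils.createPairedStream is an alternative only if the Bahir version you resolved actually exposes it.

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.mqtt.MQTTUtils

    // Placeholder broker URL and topic names - replace with your own.
    val brokerUrl = "tcp://localhost:1883"
    val topics = Seq("sensors/temperature", "sensors/humidity")

    // Reuse the SparkContext that spark-shell already exposes as `sc`.
    val ssc = new StreamingContext(sc, Seconds(5))

    // One receiver-based stream per topic, unioned into a single DStream of payloads.
    val streams = topics.map { topic =>
      MQTTUtils.createStream(ssc, brokerUrl, topic, StorageLevel.MEMORY_AND_DISK_SER_2)
    }
    val messages = ssc.union(streams)
    messages.print()

    ssc.start()
    ssc.awaitTermination()

Note that each receiver occupies an executor core, so the shell needs enough cores for all topics plus the processing itself; the log above shows master = local[*], which is usually sufficient.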
