MQTTUtils.createPairedStream() is not a member of org.apache.bahir


When I started spark-shell with the following command

bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0 --repositories http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/

two errors occurred:

Server access error at url https://central.maven.org/org/apache/bahir/bahir-parent_2.11/2.3.2/bahir-parent_2.11-2.3.2.jar (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching central.maven.org found.)

and

Server access error at url https://central.maven.org/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.2/spark-streaming-mqtt_2.11-2.3.2-javadoc.jar (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching central.maven.org found.)

Here I gave the repository as http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/, but it automatically connects to https://central.maven.org/org/apache/bahir/bahir-parent_2.11/2.3.2/bahir-parent_2.11-2.3.2.jar, which does not exist.

How can I add these two modules to my spark-shell? My aim is to build a Spark Streaming MQTT application that handles multiple topics.
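For reference, here is a minimal sketch of the kind of application I am trying to write, assuming a Bahir build whose MQTTUtils exposes createPairedStream (the exact signature, broker URL, and topic names below are my assumptions, not confirmed against 2.3.0):

    // Minimal sketch of the intended multi-topic application. The broker URL
    // and topic names are placeholders; createPairedStream is assumed to be
    // available in the Bahir build on the classpath.
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.mqtt.MQTTUtils

    val ssc = new StreamingContext(sc, Seconds(5))

    // Subscribes to several topics at once and yields (topic, payload) pairs,
    // so records can be routed per topic downstream.
    val stream = MQTTUtils.createPairedStream(
      ssc,
      "tcp://localhost:1883",                      // placeholder broker URL
      Array("sensors/temp", "sensors/humidity"),   // placeholder topics
      StorageLevel.MEMORY_AND_DISK_SER_2)

    stream.foreachRDD { rdd =>
      rdd.foreach { case (topic, payload) => println(s"$topic -> $payload") }
    }

    ssc.start()
    ssc.awaitTermination()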

1 Answer

It would be an issue with your system. There are many possible causes of javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException. One of them is a mismatch between the requested host URL (which may be an IP address) and the certificate (which usually lists a DNS hostname): the request fails because the certificate is missing aliases (subject alternative names) for that host, which happens when the server is accessed under a name other than its default one.
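To see which names a server's certificate actually covers, you can dump its subject alternative names with plain JDK APIs. A quick diagnostic sketch (the URL is just an example host):

    // Diagnostic sketch using only standard JDK APIs: connect over TLS and
    // print the subject alternative names the server's certificate carries.
    import java.net.URL
    import java.security.cert.X509Certificate
    import javax.net.ssl.HttpsURLConnection
    import scala.collection.JavaConverters._

    val conn = new URL("https://repo1.maven.org/maven2/").openConnection().asInstanceOf[HttpsURLConnection]
    conn.connect()
    conn.getServerCertificates.collect { case c: X509Certificate => c }.foreach { cert =>
      println(s"Subject: ${cert.getSubjectX500Principal}")
      // Each SAN entry is a two-element list: [type-id, value]
      Option(cert.getSubjectAlternativeNames).foreach { sans =>
        sans.asScala.foreach(san => println(s"  SAN: $san"))
      }
    }
    conn.disconnect()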

The problem can be resolved in a number of ways. Please find some alternatives in the links below:

https://support.mulesoft.com/s/article/CertificateException-No-Subject-Alternative-Names-Present

https://support.cloudbees.com/hc/en-us/articles/360017693231-Why-am-I-getting-No-subject-alternative-DNS-name-matching-XXX-when-connecting-through-ldaps-

https://confluence.atlassian.com/confkb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-452100730.html

https://confluence.atlassian.com/jirakb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-297669411.html

I am able to add the modules to spark-shell. Please find the snippet below.

    C:\Users\XYzUser>spark-shell --repositories http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0
http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ added as a remote repository with the name: repo-1
Ivy Default Cache set to: C:\Users\..\.ivy2\cache
The jars for the packages stored in: C:\Users\..\.ivy2\jars
:: loading settings :: url = jar:file:/C:/Tools/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.bahir#spark-streaming-mqtt_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb;1.0
        confs: [default]
        found org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 in central
        found org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 in central
        found org.spark-project.spark#unused;1.0.0 in user-list
:: resolution report :: resolve 7200ms :: artifacts dl 16ms
        :: modules in use:
        org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 from central in [default]
        org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 from central in [default]
        org.spark-project.spark#unused;1.0.0 from user-list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   3   |   1   |   1   |   0   ||   3   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
        unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb
        confs: [default]
        0 artifacts copied, 3 already retrieved (0kB/31ms)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://...:4040
Spark context available as 'sc' (master = local[*], app id = local-1552454258705).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/
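
Once the shell starts with the artifacts resolved, a quick way to confirm the package is usable is to import it and wire up a single-topic stream. A small sketch (the broker URL and topic are placeholders):

    // Quick check that the Bahir classes resolved above are on the classpath.
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.mqtt.MQTTUtils

    val ssc = new StreamingContext(sc, Seconds(10))
    // createStream subscribes to a single topic; values are placeholders.
    val lines = MQTTUtils.createStream(ssc, "tcp://localhost:1883", "test/topic", StorageLevel.MEMORY_ONLY)
    lines.print()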