How to connect to SQL Server using Pyspark which is running on GCP without using Secure Sockets Layer?


I am trying to connect to a SQL Server database using PySpark as below:

from pyspark.sql import SparkSession
import traceback

def connect_and_read(spark: SparkSession):
    # Note: the backslash before the instance name must be escaped in a Python string literal
    url = 'jdbc:sqlserver://DUMMY1234.DUMMY.COM\\DUMMY1234;databaseName=Dummy_DB;encrypt=false'
    driver = 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
    try:
        dataframe = spark.read.format('jdbc') \
            .option('url', url) \
            .option('driver', driver) \
            .option('user', 'username') \
            .option('password', 'password') \
            .option('dbtable', 'TABLENAME') \
            .load()
        print(f'Count: {dataframe.count()}')
        dataframe.take(10)
    except Exception as ex:
        # print_exception (not print_exc) takes the (type, value, traceback) triple
        traceback.print_exception(type(ex), ex, ex.__traceback__)


if __name__ == '__main__':
    # Setting 'spark.driver.extraClassPath' twice overwrites the first value,
    # so both jars go into one colon-separated path.
    spark = SparkSession.builder \
        .master('yarn') \
        .config('spark.app.name', 'read_data_sqlserver') \
        .config('spark.driver.extraClassPath',
                'path_to_/mssql-jdbc-9.2.0.jre8.jar:path_to_/spark-mssql-connector-1.0.1.jar') \
        .getOrCreate()
    connect_and_read(spark)

I am running this code on Google Cloud Platform: I created a Dataproc cluster for this operation and am submitting my job there. The job fails with the exception below:
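For reference, this is roughly how the job is submitted to Dataproc (cluster name, region, and jar locations are placeholders, not my real values; passing the driver jars via `--jars` also ships them to the executors, which `spark.driver.extraClassPath` alone does not):

```shell
# Sketch of the submission command; cluster, region, and jar paths are placeholders.
gcloud dataproc jobs submit pyspark pattern.py \
    --cluster=my-cluster \
    --region=us-central1 \
    --jars=gs://my-bucket/jars/mssql-jdbc-9.2.0.jre8.jar,gs://my-bucket/jars/spark-mssql-connector-1.0.1.jar
```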

py4j.protocol.Py4JJavaError: An error occurred while calling o70.load.
: com.microsoft.sqlserver.jdbc.SQLServerException: The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Connection reset ClientConnectionId:1223412f-9879702-wfwd-134qq-2143d123e1q".

at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:3208)
at com.microsoft.sqlserver.jdbc.TDSChannel.enableSSL(IOBuffer.java:1916)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2760)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:2418)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:2265)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1291)
at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:881)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Connection reset ClientConnectionId:1223412f-9879702-wfwd-134qq-2143d123e1q.
at com.microsoft.sqlserver.jdbc.TDSChannel$SSLHandshakeInputStream.readInternal(IOBuffer.java:862)
at com.microsoft.sqlserver.jdbc.TDSChannel$SSLHandshakeInputStream.read(IOBuffer.java:849)
at com.microsoft.sqlserver.jdbc.TDSChannel$ProxyInputStream.readInternal(IOBuffer.java:1019)
at com.microsoft.sqlserver.jdbc.TDSChannel$ProxyInputStream.read(IOBuffer.java:1009)
at org.conscrypt.ConscryptEngineSocket$SSLInputStream.readFromSocket(ConscryptEngineSocket.java:920)
at org.conscrypt.ConscryptEngineSocket$SSLInputStream.processDataFromSocket(ConscryptEngineSocket.java:884)
at org.conscrypt.ConscryptEngineSocket$SSLInputStream.access$100(ConscryptEngineSocket.java:706)
at org.conscrypt.ConscryptEngineSocket.doHandshake(ConscryptEngineSocket.java:230)
at org.conscrypt.ConscryptEngineSocket.startHandshake(ConscryptEngineSocket.java:209)
at com.microsoft.sqlserver.jdbc.TDSChannel.enableSSL(IOBuffer.java:1824)
... 28 more

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/portw/pattern.py", line 24, in <module>
    connect_and_read(spark)
  File "/tmp/portw/pattern.py", line 18, in connect_and_read
    traceback.print_exc(type(ex), ex, ex.__traceback__)
  File "/opt/conda/default/lib/python3.8/traceback.py", line 163, in print_exc
    print_exception(*sys.exc_info(), limit=limit, file=file, chain=chain)
  File "/opt/conda/default/lib/python3.8/traceback.py", line 103, in print_exception
    for line in TracebackException(
  File "/opt/conda/default/lib/python3.8/traceback.py", line 509, in __init__
    self.stack = StackSummary.extract(
  File "/opt/conda/default/lib/python3.8/traceback.py", line 340, in extract
    if limit >= 0:
TypeError: '>=' not supported between instances of 'type' and 'int'

21/02/09 13:59:59 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@1aa73a6d{HTTP/1.1, (http/1.1)}

Earlier, the URL did not contain encrypt=false; I added it after looking at some references.
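For context, this is roughly how I assemble the connection string (a sketch with placeholder host, instance, and database names; `encrypt` and `trustServerCertificate` are standard mssql-jdbc connection properties, with `trustServerCertificate` only consulted when encryption is on):

```python
# Sketch: building a SQL Server JDBC URL with explicit encryption settings.
# Host, instance, and database names are placeholders, not real values.
host = 'DUMMY1234.DUMMY.COM'
instance = 'DUMMY1234'
database = 'Dummy_DB'
props = {
    'databaseName': database,
    'encrypt': 'false',                # ask the driver to skip SSL entirely
    'trustServerCertificate': 'true',  # only consulted when encrypt=true
}
url = 'jdbc:sqlserver://{}\\{};{}'.format(
    host, instance,
    ';'.join('{}={}'.format(k, v) for k, v in props.items()))
print(url)
```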

I am able to connect to the same host from plain Python code for my APIs, but not from Spark.

Could anyone let me know what mistake I am making here and how to correct it?

Any help is appreciated.
