How do I access a postgres table from pyspark on IBM's Data Science Experience?


Here is my code:

uname = "xxxxx" 
pword = "xxxxx" 
dbUrl = "jdbc:postgresql:dbserver" 
table = "xxxxx"
jdbcDF = spark.read.format("jdbc") \
    .option("url", dbUrl) \
    .option("dbtable", table) \
    .option("user", uname) \
    .option("password", pword) \
    .load()

I'm getting a "No suitable driver" error after adding the postgres driver jar (%Addjar -f https://jdbc.postgresql.org/download/postgresql-9.4.1207.jre7.jar). Is there a working example of loading data from postgres in pyspark 2.0 on DSX?

2 Answers

BEST ANSWER

Use the PixieDust package manager to install the Postgres driver at the Spark service level.

http://datascience.ibm.com/docs/content/analyze-data/Package-Manager.html

Since PixieDust is only supported on Spark 1.6, run this on a Spark 1.6 kernel:

pixiedust.installPackage("https://jdbc.postgresql.org/download/postgresql-9.4.1207.jre7.jar")

Once it is installed, restart the kernel, then switch to Spark 2.0 and open your Postgres connection to get a Spark DataFrame via the SparkSession:

uname = "username"
pword = "xxxxxx"
dbUrl = "jdbc:postgresql://hostname:10635/compose?user=" + uname + "&password=" + pword
table = "tablename"

df = spark.read.format('jdbc').options(url=dbUrl, dbtable=table).load()
df.take(1)
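One caveat with embedding credentials in the JDBC URL as above: if the password contains characters such as `&` or `@`, the URL breaks unless they are percent-encoded. A minimal sketch using the standard library (the hostname, port, and credentials are placeholders, not real connection details):

```python
from urllib.parse import quote

uname = "username"   # placeholder
pword = "p@ss&word"  # placeholder with characters that need escaping

# quote() percent-encodes '@' and '&' so the URL's query string stays well-formed
dbUrl = ("jdbc:postgresql://hostname:10635/compose?user="
         + quote(uname) + "&password=" + quote(pword))
```

Alternatively, pass `user` and `password` as separate `.option()` calls and keep the URL free of credentials, which avoids the escaping issue entirely.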

Working notebook:

https://apsportal.ibm.com/analytics/notebooks/8b220408-6fc7-48a9-8350-246fbbf10ac8/view?access_token=7297af80b2e4109087a78365e7df3205f6ed9d0840c0c46d2208bc00ed0b0274

Thanks, Charles.


Just provide the driver option explicitly:

option("driver", "org.postgresql.Driver")
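This names the driver class so Spark's JDBC code path does not have to discover it via `DriverManager`, which is what fails with "No suitable driver". Put together with the question's code, a sketch of the full read (the helper name `read_postgres` is illustrative, and the URL/credentials are placeholders):

```python
def read_postgres(spark, url, table, user, password):
    """Read a Postgres table into a Spark DataFrame, naming the
    JDBC driver class explicitly to avoid 'No suitable driver'."""
    return (spark.read.format("jdbc")
            .option("url", url)                 # e.g. jdbc:postgresql://host:port/dbname
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .option("driver", "org.postgresql.Driver")
            .load())
```

The driver jar must still be on the classpath (e.g. via `%Addjar -f` or PixieDust as in the accepted answer); the `driver` option only tells Spark which class inside that jar to use.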