In DSX, is there a way to use "display" in a Scala 2.11 with Spark 2.0 notebook (I know it can be done in a Python notebook with PixieDust)? E.g.:
display(spark.sql("""SELECT COUNT(zip), SUM(pop), city FROM hive_zips_table
WHERE state = 'CA' GROUP BY city ORDER BY SUM(pop) DESC"""))
But I want to do the same in a Scala notebook. Currently I am just using the show command below, which only gives the data in a tabular format, with no graphics etc.
spark.sql("""SELECT COUNT(zip), SUM(pop), city FROM hive_zips_table
WHERE state = 'CA' GROUP BY city ORDER BY SUM(pop) DESC""").show()
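For reference, the aggregation that query performs can be checked outside Spark. Here is a minimal sketch using Python's sqlite3 with made-up sample rows (the table contents are assumptions for illustration only):

```python
# Standalone sketch of the same GROUP BY / ORDER BY aggregation,
# using sqlite3 so it runs without a Spark cluster.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE hive_zips_table (zip TEXT, pop INTEGER, city TEXT, state TEXT)"
)
# Hypothetical sample data, not from the real hive_zips_table.
rows = [
    ("94101", 50000, "San Francisco", "CA"),
    ("94102", 60000, "San Francisco", "CA"),
    ("90001", 40000, "Los Angeles", "CA"),
    ("10001", 30000, "New York", "NY"),
]
conn.executemany("INSERT INTO hive_zips_table VALUES (?, ?, ?, ?)", rows)

result = conn.execute(
    "SELECT COUNT(zip), SUM(pop), city FROM hive_zips_table "
    "WHERE state = 'CA' GROUP BY city ORDER BY SUM(pop) DESC"
).fetchall()
# San Francisco has two zips totalling 110000, so it sorts first;
# New York is filtered out by the WHERE clause.
print(result)  # [(2, 110000, 'San Francisco'), (1, 40000, 'Los Angeles')]
```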
Note:
Reference: https://github.com/ibm-cds-labs/pixiedust/wiki
But if you can use Spark 1.6, here is a quick workaround to get that fancy display function:
You can go the other way around, since PixieDust lets you use Scala and Python in one Python notebook with the %%scala cell magic.
https://github.com/ibm-cds-labs/pixiedust/wiki/Using-Scala-language-within-a-Python-Notebook
Step 1: Create a notebook with Python 2 and Spark 1.6, install PixieDust, and import it.
Define your variables or your DataFrame in Scala under the %%scala cell magic, or do whatever else is needed to create your DataFrame.
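The missing cells might look like the following. This is a sketch based on the PixieDust wiki linked above, not the answerer's exact notebook; per the wiki, Scala variables prefixed with a double underscore are shared back to the Python shell (the variable name __df and the query are assumptions):

```python
# Cell 1 (Python): install and import PixieDust.
!pip install --user --upgrade pixiedust
import pixiedust
```

```scala
%%scala
// Cell 2 (Scala via PixieDust's cell magic): build the DataFrame.
// In Spark 1.6, sqlContext is the SQL entry point in the notebook.
// The __ prefix is what exposes the variable to Python.
val __df = sqlContext.sql(
  """SELECT COUNT(zip), SUM(pop), city FROM hive_zips_table
     WHERE state = 'CA' GROUP BY city ORDER BY SUM(pop) DESC""")
```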
Step 2: In a separate cell, run the following to access the df variable in your Python shell.
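A sketch of that cell, assuming the DataFrame was declared in a %%scala cell as a variable named __df (the name is an assumption; any double-underscore-prefixed Scala variable should work per the wiki):

```python
# Back in Python: PixieDust makes __df from the Scala cell available
# here, so the interactive display() charting works on it.
display(__df)
```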
Reference to my sample notebook:
Thanks, Charles.