Problem with importing a file into a Databricks notebook


I'm trying to import a .txt file from my local computer into a Databricks notebook (Scala). I go into the Data section and click "Add Data". With the "Upload File" option selected in the top panel, I click Browse to find the file and then "Create Table in Notebook". That gives me a path to the file, but when I try to access the data in another notebook with the command val file_location = "/FileStore/tables/....txt", I get "java.io.FileNotFoundException: /FileStore/tables/....txt (No such file or directory)". Does anybody know what I'm doing wrong here and what I should do instead?

Kind regards


There is 1 answer below.


The format should be "text", not "txt". See the documentation:

scala> val df = spark.read.format("text").load("README.md")
df: org.apache.spark.sql.DataFrame = [value: string]

scala> df.count
res0: Long = 104
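
Applied to the setup from the question, the same "text" format should work against the DBFS path produced by the upload step. A minimal sketch, assuming a placeholder file name under /FileStore/tables (the real path is whatever "Create Table in Notebook" reports):

// Placeholder path -- substitute the actual file name from the upload step
val fileLocation = "/FileStore/tables/yourfile.txt"
// Read the text file into a DataFrame with a single "value" column
val df = spark.read.format("text").load(fileLocation)
df.show(5, truncate = false)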

Or you can use the spark.read.textFile function, which is really a shortcut for that:

scala> spark.read.textFile("README.md").count
res1: Long = 104
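
If the path itself is in doubt, Databricks' dbutils can list what actually landed under /FileStore/tables, so the load call points at an existing file. This is just a diagnostic sketch, with the directory name taken from the question:

// List the uploaded files to confirm the exact name and extension
display(dbutils.fs.ls("/FileStore/tables/"))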