I'm just getting started with EMR, Hadoop, Spark, etc. I am trying to use spark-shell to run Scala code that uploads a file to an EMRFS S3 location, but I am receiving the error below.

Without any import, if I run:
val bucketName = "bucket"
val outputPath = "test.txt"
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:27: error: not found: value PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
Once I add the import for PutObjectRequest, I still get an error, just a different one:
scala> import com.amazonaws.services.s3.model.PutObjectRequest
import com.amazonaws.services.s3.model.PutObjectRequest
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:28: error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
I'm not sure what I am missing. Any help would be appreciated!

Note: the Spark version is 2.4.5.
You are mixing two versions of the AWS SDK for Java. The `builder` pattern belongs to SDK v2 (`software.amazon.awssdk.services.s3.model.PutObjectRequest`), but the class you imported is from SDK v1 (`com.amazonaws.services.s3.model.PutObjectRequest`), which has no `builder` member. With SDK v1 (the version bundled on EMR), create the `PutObjectRequest` via a suitable constructor instead of a builder, and create the S3 client with `AmazonS3ClientBuilder`.
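A minimal sketch of the SDK v1 approach, pasted into spark-shell on an EMR node (the bucket name and local file path are placeholders; this assumes the instance profile grants `s3:PutObject` on the bucket):

```scala
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.model.PutObjectRequest
import java.io.File

val bucketName = "bucket"        // placeholder: your bucket
val outputPath = "test.txt"      // placeholder: the S3 key to write

// SDK v1 client, built via the client builder (credentials and region
// are picked up from the EMR instance profile by default)
val s3Client = AmazonS3ClientBuilder.defaultClient()

// SDK v1 has no PutObjectRequest.builder; use a constructor instead
val putRequest = new PutObjectRequest(bucketName, outputPath, new File("/tmp/test.txt"))
s3Client.putObject(putRequest)
```

If you would rather keep the builder-style code from your snippet, you would instead need SDK v2 on the classpath and the `software.amazon.awssdk.services.s3.model.PutObjectRequest` import, together with an SDK v2 `S3Client`.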