My code is below:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object WordCounter {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Word Counter").setMaster("local")
    val sc = new SparkContext(conf)

    // Read the file and split each line into words
    val textFile = sc.textFile("C:/spark/README.md")
    val tokenizedFileData = textFile.flatMap(line => line.split(" "))

    // Pair each word with 1, then sum the counts per word
    val countPrep = tokenizedFileData.map(word => (word, 1))
    val counts = countPrep.reduceByKey((accumValue, newValue) => accumValue + newValue)

    // Sort by count, descending, and write the result out
    val sortedCounts = counts.sortBy(kvPair => kvPair._2, ascending = false)
    sortedCounts.saveAsTextFile("C:/Data/WordCountViaApp")

    sc.stop()
  }
}
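The same flatMap → map → reduce-per-key → sort pipeline can be sketched with plain Scala collections, which is handy for sanity-checking the counting logic without a cluster. This is an illustrative sketch with hypothetical sample input, not part of the Spark job above:

```scala
object WordCountLocal {
  // Mirrors the RDD pipeline: split into words, group, count, sort descending
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))               // flatMap(line => line.split(" "))
      .groupBy(identity)                   // stands in for reduceByKey
      .map { case (word, occs) => (word, occs.size) }

  def main(args: Array[String]): Unit = {
    // Hypothetical sample text standing in for README.md
    val lines = Seq("spark makes word count easy", "word count with spark")
    // Sort descending by count before printing, like sortBy(kvPair => kvPair._2, false)
    wordCount(lines).toSeq.sortBy(-_._2).foreach(println)
  }
}
```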
Can someone please help?
C:\Users\workspace\SparkInScala>spark-submit --class "WordCounter" "C:\Users\workspace\SparkInScala\target\scala-2.12\sparkinscala_2.12-0.1.0-SNAPSHOT.jar"

Error: Failed to load class WordCounter.
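"Failed to load class" usually means the compiled class is not actually inside the jar under that name, or the jar was built for a different Scala version than the one your Spark distribution ships with. A minimal build.sbt for this project might look like the sketch below; the version numbers are assumptions, so match the Scala version and spark-core version to your installed Spark (you can see them in the banner of spark-shell):

```scala
// build.sbt — minimal sketch, assuming a Spark release built for Scala 2.12
name := "SparkInScala"
version := "0.1.0-SNAPSHOT"
scalaVersion := "2.12.10"

// "provided": spark-submit supplies the Spark classes at runtime,
// so they should not be bundled into your jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5" % "provided"
```

After running sbt package, you can also check that WordCounter.class really is in the jar (jar tf lists the archive contents); if it is missing, the source file is probably not under src/main/scala, which is where sbt looks by default.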
I met the same problem before.
I'm using the Maven plugin in IDEA to develop Spark programs. Here is my pom.xml.
You can try it by replacing the mainClass value "testConf" with your own class name, and adjusting the Scala version, etc.
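The pom.xml itself is not included above, so here is an illustrative sketch of the kind of section the answer is describing, assuming the maven-shade-plugin is used to set the main class. The plugin version and the "testConf" placeholder are assumptions, not the answerer's actual file:

```xml
<!-- Illustrative sketch only, not the answerer's actual pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <!-- Replace testConf with your own main class, e.g. WordCounter -->
            <mainClass>testConf</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With the main class recorded in the jar manifest this way, spark-submit can also locate the entry point when the --class name matches the fully qualified name of your object.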