Apache Spark with Apache Flume integration

How can Spark Streaming be configured to receive input data from Flume in Java? I am stuck on the following code:

import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.flume.FlumeUtils;
import org.apache.spark.streaming.flume.SparkFlumeEvent;

public class FlumeEventCount {
    public static void main(String[] args) {
        // Expect the Spark master URL and the Flume receiver host/port as arguments.
        String master = args[0];
        String host = args[1];
        int port = Integer.parseInt(args[2]);

        Duration batchInterval = new Duration(2000);
        System.out.println("-Starting Spark Context");
        System.out.println("-Spark_home: " + System.getenv("SPARK_HOME"));

        JavaStreamingContext sc = new JavaStreamingContext(master,
                "FlumeEventCount", batchInterval,
                System.getenv("SPARK_HOME"), "/home/cloudera/SparkOnALog.jar");

        System.out.println("-Setting up Flume Stream: " + host + " " + port);
        // Spark starts an Avro receiver on host:port; a Flume avro sink must push events to it.
        JavaDStream<SparkFlumeEvent> flumeStream = FlumeUtils.createStream(sc, host, port);

        flumeStream.count().print();
        flumeStream.count().map(new Function<Long, String>() {
            public String call(Long in) {
                return "Received " + in + " flume events.";
            }
        }).print();

        System.out.println("-Starting Spark Context");
        sc.start();
        sc.awaitTermination(); // block here, or the driver exits before any batch runs
    }
}
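For `FlumeUtils.createStream` to receive anything, a Flume agent must be configured with an `avro` sink pointing at the same host and port the Spark receiver listens on. A minimal sketch of such an agent config, assuming a hypothetical agent named `agent1`, a netcat source for testing, and `localhost:9999` as the values passed to `createStream`:

```properties
# Hypothetical agent "agent1": netcat source -> memory channel -> avro sink.
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = avroSink

# Test source: anything typed into `nc localhost 44444` becomes an event.
agent1.sources.src1.type = netcat
agent1.sources.src1.bind = localhost
agent1.sources.src1.port = 44444
agent1.sources.src1.channels = ch1

agent1.channels.ch1.type = memory

# The sink's hostname/port must match the host and port
# passed to FlumeUtils.createStream in the Spark driver.
agent1.sinks.avroSink.type = avro
agent1.sinks.avroSink.hostname = localhost
agent1.sinks.avroSink.port = 9999
agent1.sinks.avroSink.channel = ch1
```

Start the Spark application first so the Avro receiver is listening, then start the agent with `flume-ng agent --conf-file <this file> --name agent1`; the avro sink retries if the receiver is not yet up, but starting in this order avoids a burst of connection errors.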
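Note also that `FlumeUtils` lives in a separate artifact, `spark-streaming-flume`, which is not on the classpath by default. A sketch of the Maven dependency, assuming Spark 1.2.0 on Scala 2.10 (adjust both versions to match your cluster):

```xml
<!-- Version numbers here are illustrative; match them to your Spark build. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-flume_2.10</artifactId>
  <version>1.2.0</version>
</dependency>
```

This dependency (and Flume's Avro classes it pulls in) must be in the application jar shipped to the cluster, e.g. via the shade plugin, or added with `--jars`.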