The following code works with Spark 1.5.2 but fails to compile with Spark 2.0.0. I am using Java 1.8.
final SparkConf sparkConf = new SparkConf();
sparkConf.setMaster("local[4]"); // Four threads
final JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
final JavaRDD<String> javaRDDLines = javaSparkContext.textFile("4300.txt");
final JavaRDD<String> javaRDDWords = javaRDDLines.flatMap(line -> Arrays.asList(line.split(" ")));
I get the following compile error:
Error:(46, 66) java: incompatible types: no instance(s) of type variable(s) T exist so that java.util.List<T> conforms to java.util.Iterator<U>
I can't figure out whether the Spark API or something else has changed. Please help. Thanks.
Vinay
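For reference, the error message points at the likely cause: in Spark 2.x, `FlatMapFunction.call` returns an `Iterator<U>`, whereas in Spark 1.x it returned an `Iterable<U>`, so a lambda returning a `List` no longer type-checks and needs a trailing `.iterator()`. The sketch below illustrates the new shape of the lambda with a stand-in functional interface (`FlatMapFn` is a hypothetical name used here only so the example runs without a Spark dependency; in real code the interface is Spark's `org.apache.spark.api.java.function.FlatMapFunction`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class FlatMapSketch {
    // Stand-in mirroring Spark 2.x's FlatMapFunction, whose call() returns
    // Iterator<U> (it returned Iterable<U> in Spark 1.x). Illustrative only.
    interface FlatMapFn<T, U> {
        Iterator<U> call(T t);
    }

    public static void main(String[] args) {
        // Spark 2.x style: return an Iterator, not a List,
        // by appending .iterator() to the old 1.x lambda body.
        FlatMapFn<String, String> splitWords =
                line -> Arrays.asList(line.split(" ")).iterator();

        List<String> words = new ArrayList<>();
        splitWords.call("hello spark world").forEachRemaining(words::add);
        System.out.println(words); // prints [hello, spark, world]
    }
}
```

Applied to the question's code, the last line would become `javaRDDLines.flatMap(line -> Arrays.asList(line.split(" ")).iterator())`, keeping everything else unchanged.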