Review the old and learn the new, and you may become a teacher. (Confucius)
public static void filter() {
    // Create the SparkConf
    SparkConf conf = new SparkConf().setAppName("filter").setMaster("local");
    // Create the JavaSparkContext
    JavaSparkContext sc = new JavaSparkContext(conf);
    // Prepare the input data
    List<Tuple2<Integer, String>> list = new ArrayList<Tuple2<Integer, String>>();
    list.add(new Tuple2<Integer, String>(1, "a"));
    list.add(new Tuple2<Integer, String>(2, "b"));
    list.add(new Tuple2<Integer, String>(3, "C"));
    list.add(new Tuple2<Integer, String>(4, "d"));
    list.add(new Tuple2<Integer, String>(2, "e"));
    list.add(new Tuple2<Integer, String>(3, "f"));
    list.add(new Tuple2<Integer, String>(2, "g"));
    JavaPairRDD<Integer, String> pair = sc.parallelizePairs(list);
    // filter: keep only the pairs whose key is even
    JavaPairRDD<Integer, String> filter = pair.filter(new Function<Tuple2<Integer, String>, Boolean>() {
        public Boolean call(Tuple2<Integer, String> t) throws Exception {
            return t._1 % 2 == 0;
        }
    });
    System.out.println(filter.collect());
    sc.close();
}
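The even-key selection that the Spark filter performs can be sketched with plain Java streams, no Spark cluster required. This is only an illustrative sketch: Map.Entry stands in for Scala's Tuple2, and the class name EvenKeyFilter is an assumption, not part of the original code.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EvenKeyFilter {
    public static void main(String[] args) {
        // Same sample pairs as in the Spark example above
        List<Map.Entry<Integer, String>> list = List.of(
                new SimpleEntry<>(1, "a"),
                new SimpleEntry<>(2, "b"),
                new SimpleEntry<>(3, "C"),
                new SimpleEntry<>(4, "d"),
                new SimpleEntry<>(2, "e"),
                new SimpleEntry<>(3, "f"),
                new SimpleEntry<>(2, "g"));

        // Keep only the pairs whose key is even, mirroring pair.filter(...)
        List<Map.Entry<Integer, String>> even = list.stream()
                .filter(t -> t.getKey() % 2 == 0)
                .collect(Collectors.toList());

        System.out.println(even); // prints [2=b, 4=d, 2=e, 2=g]
    }
}
```

The same lambda-style predicate, `t -> t._1 % 2 == 0`, can be passed directly to `pair.filter(...)` in Spark when compiling with Java 8 or later, replacing the anonymous Function class.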
Copyright notice:
This is an original article by "Nanmu Dashu" of 智客工坊, licensed under CC 4.0 BY-SA. Please include a link to the original source and this notice when reposting.