
Implementing WordCount with Spark Operators

程序员文章站 2022-03-27 14:14:02

1 map + reduceByKey

    sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
        .flatMap(_.split(","))   // split each comma-separated line into words
        .map((_, 1))             // pair each word with an initial count of 1
        .reduceByKey(_ + _)      // sum the counts per word
        .collect()
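The same pipeline can be traced with plain Scala collections, no cluster required: `flatMap` and `map` behave identically, and `groupBy` plus a per-key sum plays the role of `reduceByKey`'s shuffle-and-reduce. The sample lines below are made up for illustration.

```scala
// Local sketch of the map + reduceByKey WordCount (made-up sample input)
val lines = Seq("spark,hello", "hello,world")
val counts = lines
  .flatMap(_.split(","))                          // all words
  .map((_, 1))                                    // (word, 1) pairs
  .groupBy(_._1)                                  // stand-in for the shuffle
  .map { case (w, ps) => (w, ps.map(_._2).sum) }  // sum the 1s per word
// "hello" appears twice; "spark" and "world" once each
```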


2 countByValue instead of map + reduceByKey

val RDDfile = sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
// countByValue is an action: it counts each distinct element and returns
// a Map[String, Long] directly to the driver, not an RDD
RDDfile.flatMap(_.split(",")).countByValue.foreach(println)
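On local collections the same counting can be sketched as `groupBy` plus a size, which is essentially what `countByValue` computes before shipping the result map to the driver. The sample words are invented for illustration.

```scala
// Local sketch of countByValue: count occurrences of each distinct element
val words = Seq("spark", "hello", "hello")
val byValue = words.groupBy(identity).map { case (w, ws) => (w, ws.size.toLong) }
```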


3 aggregateByKey


// aggregateByKey(zeroValue)(seqOp, combOp): seqOp folds each value into a
// per-partition accumulator, combOp merges accumulators across partitions
RDDfile.flatMap(_.split(",")).map((_, 1)).aggregateByKey(0)(_ + _, _ + _).collect().foreach(println)
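What the two argument lists of `aggregateByKey` do can be simulated locally by splitting made-up data into two "partitions": `seqOp` builds a per-partition accumulator starting from the zero value, and `combOp` then merges those accumulators.

```scala
// Simulate aggregateByKey(0)(seqOp, combOp) over two hand-made "partitions"
val seqOp: (Int, Int) => Int = _ + _   // fold a value into the accumulator
val combOp: (Int, Int) => Int = _ + _  // merge two partition accumulators
val partitions = Seq(Seq(("hello", 1), ("spark", 1)), Seq(("hello", 1)))
val perPartition = partitions.map(
  _.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2).foldLeft(0)(seqOp)) })
val merged = perPartition.flatten.groupBy(_._1)
  .map { case (k, accs) => (k, accs.map(_._2).reduce(combOp)) }
```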


4 foldByKey


// foldByKey is aggregateByKey with a single function used for both steps
RDDfile.flatMap(_.split(",")).map((_, 1)).foldByKey(0)(_ + _).collect().foreach(println)
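`foldByKey(0)(_ + _)` is the special case of `aggregateByKey` where the same function merges values and partition results; a per-key `foldLeft` over local pairs (made-up data) gives the same answer.

```scala
// Local sketch of foldByKey(0)(_ + _): a per-key foldLeft with zero value 0
val pairs = Seq(("hello", 1), ("spark", 1), ("hello", 1))
val folded = pairs.groupBy(_._1)
  .map { case (k, vs) => (k, vs.map(_._2).foldLeft(0)(_ + _)) }
```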


5 groupByKey + map

// Note: groupByKey shuffles every (word, 1) pair before any summing;
// reduceByKey is usually preferred because it pre-combines on the map side
RDDfile.flatMap(_.split(",")).map((_, 1)).groupByKey().map(tuple => {
      (tuple._1, tuple._2.sum)   // sum the grouped 1s for each word
    }).collect().foreach(println)
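Locally, the intermediate collection that `groupByKey` materializes is easy to see: every value survives until the final sum, which is the cost the pre-combining operators avoid. The sample pairs are made up for illustration.

```scala
// Local sketch of groupByKey + map: all values are kept, then summed
val pairs = Seq(("hello", 1), ("spark", 1), ("hello", 1))
val grouped = pairs.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2)) }
// grouped("hello") still holds both 1s before any reduction happens
val summed = grouped.map { case (k, vs) => (k, vs.sum) }
```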


6 combineByKey

RDDfile.flatMap(_.split(",")).map((_, 1)).combineByKey(
      x => x,                    // createCombiner: seed from a key's first value
      (x: Int, y: Int) => x + y, // mergeValue: fold later values into the combiner
      (x: Int, y: Int) => x + y  // mergeCombiners: merge combiners across partitions
    ).collect().foreach(println)
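The division of labor between `combineByKey`'s three functions can be simulated over two hand-made "partitions": `createCombiner` seeds the accumulator from a key's first value, `mergeValue` folds in later values within a partition, and `mergeCombiners` joins accumulators across partitions.

```scala
// Simulate combineByKey(createCombiner, mergeValue, mergeCombiners) locally
val createCombiner: Int => Int = x => x
val mergeValue: (Int, Int) => Int = _ + _
val mergeCombiners: (Int, Int) => Int = _ + _
val parts = Seq(Seq(("hello", 1), ("hello", 1)), Seq(("hello", 1), ("spark", 1)))
val perPart = parts.map(_.groupBy(_._1).map { case (k, vs) =>
  val values = vs.map(_._2)
  (k, values.tail.foldLeft(createCombiner(values.head))(mergeValue))
})
val result = perPart.flatten.groupBy(_._1)
  .map { case (k, accs) => (k, accs.map(_._2).reduce(mergeCombiners)) }
```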

