Spark SQL in Practice: Creating a DataFrame with a case class

1. Goal:

Create a DataFrame using a case class.

2. Data source:

(1) student.txt (fields must be tab-separated, since the code below splits each line on "\t"):

1	tom	15
2	lucy	20
3	mike	18

3. Code

(1) Add the dependencies:

pom.xml

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
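Note that the `_2.11` suffix in the artifact IDs is the Scala binary version, and it must match the Scala version your project compiles with. If, for example, your build uses Scala 2.12, you would switch both artifacts to a matching suffix and a Spark release built for it (version numbers here are illustrative):

```xml
<!-- Hypothetical variant for a Scala 2.12 build; pick a Spark
     release that actually publishes _2.12 artifacts -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.8</version>
</dependency>
```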
(2)Demo2.scala
package day1209

import org.apache.spark.sql.SparkSession

/**
 * Create a DataFrame using a case class
 */
object Demo2 {
  def main(args: Array[String]): Unit = {

    val spark = SparkSession.builder().master("local").appName("CaseClassDemo").getOrCreate()

    // Read the file and split each line on tabs
    val lineRDD = spark.sparkContext.textFile("/users/macbook/TestInfo/student.txt").map(_.split("\t"))

    // Map each row of the RDD onto the schema (the case class)
    val studentRDD = lineRDD.map(x => Student(x(0).toInt, x(1), x(2).toInt))

    // Convert the RDD to a DataFrame; the implicits bring in .toDF
    import spark.implicits._
    val studentDF = studentRDD.toDF

    // Register a temporary view so the DataFrame can be queried with SQL
    studentDF.createOrReplaceTempView("student")

    spark.sql("select * from student").show

    spark.stop()
  }
}

// The case class plays the role of the schema
case class Student(stuId: Int, stuName: String, stuAge: Int)

4. Result:
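The original post does not show the output. Assuming the three-row sample file above, `show` should print a table along these lines (column names come from the case class fields):

```
+-----+-------+------+
|stuId|stuName|stuAge|
+-----+-------+------+
|    1|    tom|    15|
|    2|   lucy|    20|
|    3|   mike|    18|
+-----+-------+------+
```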

Copyright © 2015-2021 Movle. Unless otherwise stated, all articles on this blog are the author's own work; please credit the source when reposting.