• Processing a jsonFile with Spark


    According to the Spark documentation, the jsonFile here is a special kind of file:

    Note that the file that is offered as jsonFile is not a typical JSON file. Each line must contain a separate, self-contained valid JSON object. As a consequence, a regular multi-line JSON file will most often fail.

    In other words, the file must contain one self-contained JSON object per line; a pretty-printed, multi-line JSON document will fail to load.
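    This per-line requirement can be sanity-checked with plain Scala before handing the file to Spark. Below is a minimal sketch (no Spark involved); `looksLineDelimited` is a hypothetical helper that only verifies each non-empty line looks like a self-contained object:

```scala
// One JSON object per line: the format jsonFile expects.
val goodContent =
  """{"name":"Michael"}
    |{"name":"Andy", "age":30}
    |{"name":"Justin", "age":19}""".stripMargin

// A pretty-printed multi-line JSON document, which jsonFile rejects.
val badContent =
  """{
    |  "name": "Michael"
    |}""".stripMargin

// Crude check: every non-empty line must itself read as a complete object.
def looksLineDelimited(content: String): Boolean =
  content.split("\n").filter(_.trim.nonEmpty).forall { line =>
    val t = line.trim
    t.startsWith("{") && t.endsWith("}")
  }

println(looksLineDelimited(goodContent)) // true
println(looksLineDelimited(badContent))  // false
```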

    Here is the content of an example jsonFile:

    scala> val path = "examples/src/main/resources/people.json"
    path: String = examples/src/main/resources/people.json
    
    scala> import scala.io.Source
    import scala.io.Source
    
    scala> Source.fromFile(path).foreach(print)
    {"name":"Michael"}
    {"name":"Andy", "age":30}
    {"name":"Justin", "age":19}

    Loading the file through SQLContext.jsonFile yields a SchemaRDD:

    scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    scala> val jsonFile = sqlContext.jsonFile(path)
    scala> jsonFile.printSchema()
    root
     |-- age: integer (nullable = true)
     |-- name: string (nullable = true)
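    The inferred schema can be mirrored in plain Scala with a case class, where the nullable age column becomes an Option[Int]. The regex extraction below is a hypothetical stand-in good only for this tiny example; Spark infers the schema by scanning the fields of every record:

```scala
// Mirrors the inferred schema: name is a string, age a nullable integer.
case class Person(name: String, age: Option[Int])

// Toy field extraction for this example only (not a real JSON parser).
val nameRe = "\"name\"\\s*:\\s*\"([^\"]*)\"".r
val ageRe  = "\"age\"\\s*:\\s*(\\d+)".r

def parseLine(line: String): Person = {
  val name = nameRe.findFirstMatchIn(line).map(_.group(1)).getOrElse("")
  val age  = ageRe.findFirstMatchIn(line).map(_.group(1).toInt)
  Person(name, age)
}

val lines = Seq(
  """{"name":"Michael"}""",
  """{"name":"Andy", "age":30}""",
  """{"name":"Justin", "age":19}"""
)
val people = lines.map(parseLine)

// Michael's record has no age field, which is why the column is nullable.
println(people.head) // Person(Michael,None)
```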

    You can also operate on the SchemaRDD row by row. Note that with the inferred schema above, the columns are ordered alphabetically, so row(0) is age:

    jsonFile.filter(row => { val age = row(0).asInstanceOf[Int]; age >= 13 && age <= 19 }).collect

    Since it is a SchemaRDD, you can also query it with SQL after registering it as a temporary table:

    scala> jsonFile.registerTempTable("people")
    scala> val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
    scala> teenagers.foreach(println)
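    The SQL query above can be read as the following plain-Scala collection operation (a sketch of what the query computes, not Spark API; the tuples mirror the three sample records):

```scala
// (name, age) pairs mirroring people.json; a missing age becomes None.
val people = Seq(
  ("Michael", None: Option[Int]),
  ("Andy", Some(30)),
  ("Justin", Some(19))
)

// SELECT name FROM people WHERE age >= 13 AND age <= 19
val teenagers = people.collect {
  case (name, Some(age)) if age >= 13 && age <= 19 => name
}

teenagers.foreach(println) // prints: Justin
```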
• Original article: https://www.cnblogs.com/bluejoe/p/5115851.html