• Compiling and Installing Spark 1.4.1


    1. Download
    Download address:
    http://spark.apache.org/downloads.html

    On the download page, choose the source package (a command-line fetch is sketched below).
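    As a minimal alternative to downloading through the browser, the source tarball can be fetched directly; the archive URL below follows the standard Apache archive layout and is an assumption, so verify it before use:

    # Fetch the Spark 1.4.1 source tarball (URL assumed, check it first)
    wget https://archive.apache.org/dist/spark/spark-1.4.1/spark-1.4.1.tgz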
     

    2. Building from source
    1) Unpack the source:
    tar -zxvf spark-1.4.1.tgz
    2) Compile
    Spark can be built in three ways:

    1. SBT build
    2. Maven build
    Prerequisites: 1. JDK  2. Maven  3. Scala (a quick version check is sketched after this list)
    Maven build:
    mvn clean package -DskipTests -Phadoop-2.2 -Dhadoop.version=2.2.0 -Pyarn -Phive -Phive-thriftserver
    3. Generating a deployable package
    make-distribution build:
    ./make-distribution.sh --tgz -Phadoop-2.2 -Dhadoop.version=2.2.0 -Pyarn -Phive-0.13.1 -Phive-thriftserver
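    Before starting a build, a quick sanity check of the prerequisites from the shell can save a failed run; this is only a sketch, and the versions it prints depend on the local install:

    java -version     # a JDK must be installed and on the PATH
    mvn -version      # Apache Maven, used by both the mvn and make-distribution builds
    scala -version    # Scala, needed for the SBT route
    echo $JAVA_HOME   # should point at the JDK installation directory

    For the SBT route, the Spark 1.x source tree ships a build/sbt wrapper that accepts the same profile flags as Maven, e.g. build/sbt -Pyarn -Phadoop-2.2 assembly (see the upstream building guide for details).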

    Go to the Spark source root and build with make-distribution.sh:

    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
    cd spark-1.4.1
    sudo ./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -Phive-thriftserver -DskipTests clean package

    If the build fails partway through, rerun it; failures are often transient (e.g. dependency downloads), and after a few attempts it usually succeeds.

    After a successful build, the deployable package is in the source root:
    spark-1.4.1-bin-2.2.0.tgz
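    A quick way to confirm the package was produced and is not truncated (a minimal check, not required):

    ls -lh spark-1.4.1-bin-2.2.0.tgz           # package should exist and have a non-trivial size
    tar -tzf spark-1.4.1-bin-2.2.0.tgz | head  # listing the archive verifies it is readable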

    3. Installation: omitted in the original post, as it is the same as for previous versions. A minimal sketch of the usual standalone flow is given below.
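    Since the step is omitted, the following is only a sketch of a typical standalone install; the target directory /home/lib is inferred from the log paths later in this post and is otherwise an assumption:

    # Unpack the package built above (target directory is an assumption)
    tar -zxvf spark-1.4.1-bin-2.2.0.tgz -C /home/lib
    cd /home/lib/spark-1.4.1-bin-2.2.0

    # Basic standalone configuration
    cp conf/spark-env.sh.template conf/spark-env.sh   # JAVA_HOME and other env vars go here
    cp conf/slaves.template conf/slaves               # one worker hostname per line

    # Start the master and workers
    sbin/start-all.sh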
    4. Errors encountered when starting the cluster:
    1) Problem 1: worker nodes fail to start

    localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
    localhost: failed to launch org.apache.spark.deploy.worker.Worker:
    localhost:      at org.apache.spark.launcher.SparkClassCommandBuilder.buildCommand(SparkClassCommandBuilder.java:98)
    localhost:      at org.apache.spark.launcher.Main.main(Main.java:74)
    localhost: full log in /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
    localhost: Connection to localhost closed.
    The cause is the GCJ-based Java bundled with the OS:
    rpm -qa | grep java
    gcc-java-4.4.7-4.el6.x86_64
    java_cup-0.10k-5.el6.x86_64

    java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64

    Uninstall them:
    rpm -e --nodeps java_cup-0.10k-5.el6.x86_64
    rpm -e --nodeps java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64
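    After removing the GCJ packages, it is worth confirming that the java resolved on the PATH is a real JDK; a minimal check, assuming a proper JDK is already installed:

    rpm -qa | grep java   # the gcj packages should no longer appear
    which java            # should resolve to the intended JDK, not the old gcj binary
    java -version         # should report the installed JDK (e.g. HotSpot), not gcj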

    2) Problem 2: JAVA_HOME is not set

    localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/spark-org.apache.spark.deploy.worker.Worker-1-is xxxx.out
    localhost: failed to launch org.apache.spark.deploy.worker.Worker:
    localhost:   JAVA_HOME is not set
    localhost: full log in /lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-isxxxx.out
    localhost: Connection to localhost closed.

    Find the shell script that reports the error and add export JAVA_HOME=... to it; in practice, adding export JAVA_HOME=... to conf/spark-env.sh is enough (a sketch follows).
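    A minimal sketch of the relevant lines in conf/spark-env.sh; the JDK path below is a made-up example and must be replaced with the actual install location:

    # conf/spark-env.sh (JDK path is an example, adjust to your system)
    export JAVA_HOME=/usr/java/jdk1.7.0_79
    export PATH=$JAVA_HOME/bin:$PATH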


    The web UI after a successful start (screenshot not reproduced here):
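    Since the screenshot is not reproduced, here is a quick way to confirm the daemons are up from the shell; the port is the standalone master's default and may differ if it was changed:

    jps                                   # should list Master and Worker processes
    curl -s http://localhost:8080 | head  # standalone master web UI, default port 8080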
     


  • Original post: https://www.cnblogs.com/ilinuxer/p/5117844.html