• Spark local configuration


    1. Download and extract the installation packages

    tar -xvf spark-2.0.2-bin-hadoop2.6.tgz

    tar -xvf scala-2.11.8.tgz
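
    If the JDK and Scala are not yet on the PATH, they can be added to /etc/profile; the paths below follow the ones referenced later in spark-env.sh (a minimal sketch, assuming that layout):

    export JAVA_HOME=/usr/local/src/jdk1.8.0_221
    export SCALA_HOME=/usr/local/src/scala-2.11.8
    export PATH=$JAVA_HOME/bin:$SCALA_HOME/bin:$PATH

    source /etc/profile
    scala -version    # should report version 2.11.8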

    2. Edit the Spark configuration files

    cd spark-2.0.2-bin-hadoop2.6/conf/
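
    The conf directory ships only *.template files by default; spark-env.sh and slaves are usually created from them first (in case they do not exist yet):

    cp spark-env.sh.template spark-env.sh
    cp slaves.template slaves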

    vim spark-env.sh

    export SCALA_HOME=/usr/local/src/scala-2.11.8
    export JAVA_HOME=/usr/local/src/jdk1.8.0_221
    export HADOOP_HOME=/usr/local/src/hadoop-2.6.1
    export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
    SPARK_MASTER_IP=master
    SPARK_LOCAL_DIRS=/usr/local/src/spark-2.0.2-bin-hadoop2.6
    SPARK_DRIVER_MEMORY=1G

    vim slaves

    slave1
    slave2
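
    The hostnames master, slave1 and slave2 must resolve on every node, typically via /etc/hosts; the IP addresses below are placeholders, not values from the original post:

    # /etc/hosts on every node (example IPs only)
    192.168.1.10  master
    192.168.1.11  slave1
    192.168.1.12  slave2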

    3. Copy the Spark directory to slave1 and slave2

    scp -r /usr/local/src/spark-2.0.2-bin-hadoop2.6 root@slave1:/usr/local/src/spark-2.0.2-bin-hadoop2.6

    scp -r /usr/local/src/spark-2.0.2-bin-hadoop2.6 root@slave2:/usr/local/src/spark-2.0.2-bin-hadoop2.6
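
    spark-env.sh references the same Scala, JDK and Hadoop paths on every node; if the slaves do not already have Scala installed, it can be copied the same way (a sketch, assuming the identical /usr/local/src layout):

    scp -r /usr/local/src/scala-2.11.8 root@slave1:/usr/local/src/

    scp -r /usr/local/src/scala-2.11.8 root@slave2:/usr/local/src/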

    4. Start the cluster

    Start the Hadoop cluster first, then start the Spark cluster:

    cd /usr/local/src/spark-2.0.2-bin-hadoop2.6/sbin

    ./start-all.sh
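
    After start-all.sh finishes, jps should show a Master process on master and a Worker process on each slave, alongside the Hadoop daemons:

    jps    # master node: Master; slave1/slave2: Worker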

    5. Web monitoring dashboard:

    master:8080
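
    The Master UI on port 8080 lists the registered workers, and each running application serves its own UI on port 4040. Whether the Master UI is up can also be checked from the shell:

    curl -s -o /dev/null -w "%{http_code}\n" http://master:8080    # expect 200 when the Master UI is up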

    6. Verification

    Local mode: ./bin/run-example --master local[2] SparkPi 10

    Standalone cluster mode (run spark-submit from the bin directory):

    ./spark-submit --class org.apache.spark.examples.SparkPi --master spark://master:7077 /usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/spark-examples_2.11-2.0.2.jar 100

    Spark on YARN (cluster mode):

    ./spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster /usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/spark-examples_2.11-2.0.2.jar 10
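
    yarn-cluster still works in Spark 2.0.2 but is deprecated; the equivalent current form separates the master URL and the deploy mode:

    ./spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster /usr/local/src/spark-2.0.2-bin-hadoop2.6/examples/jars/spark-examples_2.11-2.0.2.jar 10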

  • Original post: https://www.cnblogs.com/bigband/p/13532466.html