• Spark cluster on Mesos

    Official documentation: http://spark.apache.org/docs/latest/running-on-mesos.html

    Environment:
    CentOS 7
    spark-2.0
    jdk-1.8
    mesos-1.0
    I. Mesos

    zk1: 192.168.8.101

    zk2: 192.168.8.102

    zk3: 192.168.8.103

    mesos-m1: 192.168.8.101

    mesos-m2: 192.168.8.102

    mesos-m3: 192.168.8.103

    mesos-a1: 192.168.8.101

    mesos-a2: 192.168.8.102

    mesos-a3: 192.168.8.103

    For setup details, see the mesos+marathon+docker post; a quick health check follows below.
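
    Before installing Spark it is worth confirming that the Mesos masters and agents are actually up. A minimal check over the HTTP endpoints, assuming the default ports (5050 for masters, 5051 for agents):

    # query a master's state; if this master is not the current leader,
    # retry another master or follow its /master/redirect endpoint
    curl -s http://192.168.8.101:5050/master/state | python -m json.tool | grep -E '"hostname"|"activated_slaves"'

    # each agent answers on 5051
    curl -s http://192.168.8.101:5051/state >/dev/null && echo "agent OK"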


    II. Spark
    1. Configure the JDK
    JAVA_HOME=/opt/jdk
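
    To make JAVA_HOME survive new shells, one option is a profile.d snippet, mirroring the spark.sh file created below (this assumes the JDK was unpacked to /opt/jdk):

    cat >/etc/profile.d/jdk.sh <<HERE
    # JDK location for Spark and Mesos; adjust if the JDK lives elsewhere
    export JAVA_HOME=/opt/jdk
    export PATH=\$JAVA_HOME/bin:\$PATH
    HERE
    source /etc/profile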
    2. Install Spark
    tar -xvf spark-2.0.0-bin-hadoop2.7.tgz -C /opt/
    mv /opt/spark-2.0.0-bin-hadoop2.7 /opt/spark
    cp /opt/spark/conf/log4j.properties.template /opt/spark/conf/log4j.properties
    sed -i 's/INFO/WARN/g' /opt/spark/conf/log4j.properties
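
    At this point the unpacked distribution can be smoke-tested without any cluster; assuming the JDK from step 1 is in effect:

    # prints the Spark, Scala and Java build info if the installation is healthy
    /opt/spark/bin/spark-submit --version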
    Set the Spark environment variables:
    cat >/etc/profile.d/spark.sh <<HERE
    export SPARK_HOME=/opt/spark
    HERE

    source /etc/profile

    root@router:~#/opt/spark/bin/

    beeline       run-example   sparkR        spark-sql     

    pyspark       spark-class   spark-shell   spark-submit 


    Note: make sure the hostnames can be resolved.
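
    If DNS is not available, static entries in /etc/hosts are enough. A sketch covering the hosts used in this walkthrough (the names are illustrative; also make sure each machine's own hostname resolves):

    cat >>/etc/hosts <<HERE
    192.168.8.101 zk1 mesos-m1 mesos-a1 spark1
    192.168.8.102 zk2 mesos-m2 mesos-a2 spark2
    192.168.8.103 zk3 mesos-m3 mesos-a3 spark3
    HERE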

    pyspark

    root@router:~#/opt/spark/bin/pyspark 

    Python 2.7.5 (default, Nov 20 2015, 02:00:19) 

    [GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2

    Type "help", "copyright", "credits" or "license" for more information.

    16/08/02 17:55:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.0.0
          /_/


    Using Python version 2.7.5 (default, Nov 20 2015 02:00:19)

    SparkSession available as 'spark'.

    >>> 

    spark-shell

    root@router:~#/opt/spark/bin/spark-shell 

    16/08/02 18:00:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    16/08/02 18:00:26 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.

    Spark context Web UI available at http://192.168.8.254:4040

    Spark context available as 'sc' (master = local[*], app id = local-1470132026404).

    Spark session available as 'spark'.

    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
          /_/


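    Before wiring Spark to Mesos, a quick local smoke test confirms the installation itself works; run-example ships with the distribution:

    # runs the SparkPi example locally with 10 tasks; look for "Pi is roughly ..."
    /opt/spark/bin/run-example SparkPi 10
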

    3. Configure the Spark cluster

    http://spark.apache.org/docs/latest/running-on-mesos.html

    spark1: 192.168.8.101

    spark2: 192.168.8.102

    spark3: 192.168.8.103
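
    Per the running-on-mesos page above, any node that launches Spark drivers needs the Mesos native library, and Mesos agents need a URI from which to fetch the Spark package for executors. A sketch of conf/spark-env.sh; the libmesos path and tarball URL are assumptions (here the same local web server used for the examples below):

    cat >>/opt/spark/conf/spark-env.sh <<HERE
    # Mesos native library; the path depends on how Mesos was installed
    export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
    # where Mesos agents download the Spark distribution for executors
    export SPARK_EXECUTOR_URI=http://192.168.8.254/ftp/spark-2.0.0-bin-hadoop2.7.tgz
    HERE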

    /opt/spark/sbin/start-mesos-dispatcher.sh --master mesos://zk://192.168.8.101:2181,192.168.8.102:2181,192.168.8.103:2181/mesos 
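
    If the dispatcher came up, it registers with Mesos as a framework and by default listens on 7077 for submissions and 8081 for its web UI. A quick check on the node where it was started:

    # the dispatcher should be listening on the submission and UI ports
    ss -tlnp | grep -E ':7077|:8081'

    # the web UI answers over HTTP
    curl -sI http://192.168.8.102:8081 | head -1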


    4. Submit a job to the Spark cluster

    Here we simply reuse the examples that ship with Spark.

    /opt/spark/bin/spark-submit \
      --master mesos://192.168.8.102:7077 \
      --deploy-mode cluster \
      --supervise \
      --executor-memory 1G \
      --total-executor-cores 100 \
      http://192.168.8.254/ftp/examples/src/main/python/pi.py \
      1000

    /opt/spark/bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master mesos://192.168.8.102:7077 \
      --deploy-mode cluster \
      --supervise \
      --executor-memory 1G \
      --total-executor-cores 100 \
      http://192.168.8.254/ftp/examples/jars/spark-examples_2.11-2.0.0.jar \
      1000


    Note: in cluster mode, the submitted application must be referenced by a URI reachable from the cluster, e.g. http:// or hdfs://.
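
    That is because in cluster mode the driver itself runs on a Mesos agent, which has to fetch the application on its own; a local path on the submitting machine is invisible to it. Any web server works for distributing artifacts; a throwaway one for testing (port and directory are arbitrary):

    # serve the Spark examples over HTTP with the Python that ships with CentOS 7
    cd /opt/spark && python -m SimpleHTTPServer 8000

    # then submit e.g. http://<this-host>:8000/examples/jars/spark-examples_2.11-2.0.0.jar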

