• A few bugs encountered when installing Spark on CDH


Starting Spark after installation:

    [zdwy@master spark]$ sbin/start-all.sh
    starting org.apache.spark.deploy.master.Master, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
    failed to launch org.apache.spark.deploy.master.Master:
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more
    full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
    slave1: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
    slave2: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
    slave1: failed to launch org.apache.spark.deploy.worker.Worker:
    slave1: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    slave1: ... 6 more
    slave1: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
    slave2: failed to launch org.apache.spark.deploy.worker.Worker:
    slave2: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    slave2: ... 6 more
    slave2: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out

Cause: the JAR packages that Spark needs to communicate with Hadoop are missing from the classpath; the truncated stack traces above (`... 7 more` under `ClassLoader.loadClass`) come from a ClassNotFoundException raised while loading them.

Solution: download three JAR packages — jackson-core-xxx.jar, jackson-annotations-xxx.jar, and jackson-databind-xxx.jar — from http://mvnrepository.com/artifact/com.fasterxml.jackson.core/

After downloading, put the JARs into the hadoop/share/hadoop/common/ directory and restart Spark.
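The fix above can be sketched as a short script. The Jackson version number, the Maven Central download URL, and the Hadoop install path below are assumptions for illustration — pick the version that matches your Spark build, and adjust the paths to your own CDH layout.

```shell
#!/bin/sh
# Sketch only: version and paths are assumptions, not from the original post.
JACKSON_VER=2.2.3                                   # assumed version; match your Spark build
HADOOP_COMMON=/home/zdwy/cdh5.9/hadoop/share/hadoop/common
SPARK_HOME=/home/zdwy/cdh5.9/spark

# Fetch the three Jackson jars from Maven Central into Hadoop's common dir
for jar in jackson-core jackson-annotations jackson-databind; do
  wget -P "${HADOOP_COMMON}/" \
    "https://repo1.maven.org/maven2/com/fasterxml/jackson/core/${jar}/${JACKSON_VER}/${jar}-${JACKSON_VER}.jar"
done

# Restart the standalone cluster so Master and Workers pick up the new classpath
"${SPARK_HOME}/sbin/stop-all.sh"
"${SPARK_HOME}/sbin/start-all.sh"
```

After restarting, check the Master log named in the error output (under spark/logs/) to confirm the ClassNotFoundException is gone.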

  • Original article: https://www.cnblogs.com/zhaojinyan/p/9599521.html