The error messages were as follows:
spark02: failed to launch: nice -n 0 /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark01:7077
spark03: failed to launch: nice -n 0 /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark01:7077
spark02: JAVA_HOME is not set
spark02: full log in /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/logs/spark-spark-org.apache.spark.deploy.worker.Worker-1-spark02.out
spark03: JAVA_HOME is not set
spark03: full log in /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/logs/spark-spark-org.apache.spark.deploy.worker.Worker-1-spark03.out
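The per-worker log files named in the output can be inspected directly, for example (path taken verbatim from the message above):

# Tail the worker log on spark02; the spark03 log has the matching hostname in its name
tail -n 50 /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/logs/spark-spark-org.apache.spark.deploy.worker.Worker-1-spark02.out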
I couldn't make heads or tails of it, so I searched online and eventually found a solution:
The solution was quite easy and straightforward: just add export JAVA_HOME=/usr/java/default to /root/.bashrc, and the Spark services start successfully from the root user without the "JAVA_HOME is not set" error. Hope it helps somebody facing the same problem.
In other words, on every worker node, append the JDK path export to the end of /root/.bashrc for the root user.
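As a concrete sketch of that advice (the /usr/java/default path comes from the quoted answer; substitute the directory where your JDK is actually installed):

# On each worker node (spark02, spark03), as root
echo 'export JAVA_HOME=/usr/java/default' >> /root/.bashrc
source /root/.bashrc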
I tried exactly that, but to my surprise the problem was not solved: the same error kept appearing, and after searching a while longer I still found no answer.
Then it occurred to me: the answer online said to add the Java path under the root user, but I was not using root. So I added the Java path to the .bashrc in the home directory of the user I actually start Spark with, and the problem was solved.
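A minimal sketch of the fix that finally worked, assuming the cluster is started by a non-root user (the JDK path below is a placeholder; point it at your real install):

# On each worker node, as the user that actually runs Spark (not root)
echo 'export JAVA_HOME=/usr/local/jdk1.8.0' >> ~/.bashrc   # placeholder path, use your own
source ~/.bashrc
# Then restart the cluster from the master (spark01)
/usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/sbin/stop-all.sh
/usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/sbin/start-all.sh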
-- Reference: https://blog.csdn.net/Abandon_Sun/article/details/76686398