• Fixing a worker startup failure in Spark standalone cluster mode


    After configuring a Spark standalone cluster, running sbin/start-all.sh failed: one worker did not come up. The log file in that worker's Spark logs directory showed the following errors:

    20/04/01 02:46:08 WARN Utils: Service 'sparkWorker' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    (the WARN line above repeats 16 times, once for each bind retry)
    20/04/01 02:46:08 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
    java.net.BindException: Cannot assign requested address: Service 'sparkWorker' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkWorker' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)

    The BindException above ("Cannot assign requested address ... Service 'sparkWorker' failed after 16 retries") indicates an address-binding problem. First check this worker's actual IP address with ifconfig: it is 192.28.12.243.
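A quick way to confirm this kind of mismatch is to compare the addresses actually bound to the worker's interfaces against what the resolver maps the hostname to (a sketch; it assumes a Linux worker whose hostname is dxhost8002, as in this worker's hosts entry):

```shell
# Addresses actually assigned to the worker's network interfaces:
hostname -I

# What the resolver (/etc/hosts first, by default nsswitch order) maps
# the worker's hostname to:
getent hosts dxhost8002

# If the two addresses differ, Spark tries to bind its service port on an
# address the machine does not own, producing the BindException above.
```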

    Then check the worker's /etc/hosts file. Its entry is configured as:

    179.28.120.243   dxhost8002

    So the cause is a wrong IP address in the hosts file (179.28.120.243 instead of 192.28.12.243), which made the worker fail to bind its service port on startup. Correct the hosts entry to 192.28.12.243 and restart Spark, and everything returns to normal.
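The fix can be sketched as follows (the sudo requirement and SPARK_HOME location are assumptions about the environment; the addresses are the ones shown above):

```shell
# Replace the stale address in /etc/hosts with the one reported by ifconfig
# (run on the affected worker; editing /etc/hosts typically needs root):
sudo sed -i 's/^179\.28\.120\.243/192.28.12.243/' /etc/hosts

# Restart the cluster from the master node:
$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh
```

Alternatively, as the exception message itself suggests, the bind address can be pinned explicitly, e.g. by setting SPARK_LOCAL_IP=192.28.12.243 in conf/spark-env.sh on that worker.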

• Original post: https://www.cnblogs.com/benfly/p/12613261.html