• java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries


    In an already-running cluster environment (CentOS 6.6 + Hadoop 2.7 + HBase 0.98 + Spark 1.3.1), I was debugging a Spark job that reads from HBase, from IntelliJ on a Windows 7 machine. The run failed immediately with:

    15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
    java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
        at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
        at SparkFromHbase$.main(SparkFromHbase.scala:15)
        at SparkFromHbase.main(SparkFromHbase.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

    Looking at the Hadoop source, we find this snippet:

      public static final String getQualifiedBinPath(String executable)
          throws IOException {
        // construct hadoop bin path to the specified executable
        String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
          + File.separator + executable;

        File exeFile = new File(fullExeName);
        if (!exeFile.exists()) {
          throw new IOException("Could not locate executable " + fullExeName
            + " in the Hadoop binaries.");
        }

        return exeFile.getCanonicalPath();
      }

      private static String HADOOP_HOME_DIR = checkHadoopHome();

      private static String checkHadoopHome() {
        // first check the Dflag hadoop.home.dir with JVM scope
        String home = System.getProperty("hadoop.home.dir");

        // fall back to the system/user-global env variable
        if (home == null) {
          home = System.getenv("HADOOP_HOME");
        }
        ...
      }
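
    The mangled path in the error message comes straight from this code: when neither the JVM property nor the environment variable is set, HADOOP_HOME_DIR is null, and string concatenation renders the null reference as the literal text "null". A minimal Scala sketch reproducing the same concatenation (Java behaves identically):

      import java.io.File

      object NullPathDemo {
        def main(args: Array[String]): Unit = {
          // With HADOOP_HOME unset, HADOOP_HOME_DIR is null; the '+' operator
          // stringifies the null reference as the literal text "null".
          val hadoopHomeDir: String = null
          val fullExeName = hadoopHomeDir + File.separator + "bin" +
            File.separator + "winutils.exe"
          println(fullExeName) // on Windows: null\bin\winutils.exe
        }
      }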

    So clearly this is a HADOOP_HOME problem: if HADOOP_HOME is unset, fullExeName inevitably comes out as null\bin\winutils.exe. The fix is simple: set the HADOOP_HOME environment variable; if you don't want to reboot for it to take effect, you can instead add this to your program:

    System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0"); // note the escaped backslashes

    Note: E:\Program Files\hadoop-2.7.0 is the path where I unpacked Hadoop on my own machine.
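
    For context, here is a minimal sketch of where that call belongs (the object name and master setting are placeholders, not the original program). The property must be set before the SparkContext is constructed, because constructing the context is what triggers the Hadoop Shell static initializer seen in the stack trace above:

      import org.apache.spark.{SparkConf, SparkContext}

      object SparkFromHbase {
        def main(args: Array[String]): Unit = {
          // Must run before new SparkContext: creating the context initializes
          // org.apache.hadoop.util.Shell, which resolves winutils.exe.
          System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0")

          val conf = new SparkConf().setAppName("SparkFromHbase").setMaster("local[*]")
          val sc = new SparkContext(conf)
          // ... HBase reading code goes here ...
          sc.stop()
        }
      }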

    Run it again and you may well hit the same error, and at this point you might want to blame me. Honestly, at first I refused to take the blame: go look in your hadoop-x.x.x/bin directory and you'll find there is no winutils.exe there at all.

    So let me tell you: you can download one from GitHub, at an address everyone on Earth knows.

    Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

    Don't worry about its version; I'm on the much newer hadoop-2.7.0 and it works without issue. Once downloaded, put winutils.exe into your hadoop-x.x.x/bin directory.
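
    To double-check that the binary landed where Hadoop will look for it, here is a quick sketch mirroring Shell.getQualifiedBinPath (adjust the path to your own install):

      import java.io.File

      object CheckWinutils {
        def main(args: Array[String]): Unit = {
          // Hadoop resolves <hadoop.home.dir>\bin\winutils.exe.
          val home = "E:\\Program Files\\hadoop-2.7.0"
          val exe = new File(home + File.separator + "bin" +
            File.separator + "winutils.exe")
          println(exe.getCanonicalPath + " exists: " + exe.exists)
        }
      }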

• Original post: https://www.cnblogs.com/shizhijie/p/10007866.html