• ERROR [org.apache.hadoop.security.UserGroupInformation]


    After switching to a different environment, the following exception appeared:
    2016-10-18 23:54:01,334 WARN [org.apache.hadoop.util.NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2016-10-18 23:54:01,668 INFO [org.apache.hadoop.conf.Configuration.deprecation] - session.id is deprecated. Instead, use dfs.metrics.session-id
    2016-10-18 23:54:01,669 INFO [org.apache.hadoop.metrics.jvm.JvmMetrics] - Initializing JVM Metrics with processName=JobTracker, sessionId=
    2016-10-18 23:54:01,738 ERROR [org.apache.hadoop.security.UserGroupInformation] - PriviledgedActionException as:selfimpr (auth:SIMPLE) cause:org.apache.hadoop.util.Shell$ExitCodeException:
    Exception in thread "main" org.apache.hadoop.util.Shell$ExitCodeException:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
    at com.dajiangtai.hadoop.test.WordCount.main(WordCount.java:62)

    This puzzled me for a long time. It turned out the job would only run after copying winutils.exe and hadoop.dll into C:\Windows\System32. On Windows, RawLocalFileSystem.setPermission() shells out to winutils.exe, so the ExitCodeException above is raised when Hadoop cannot find it. A related workaround is sketched below.
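    Below is a minimal Java sketch of a closely related workaround that is often used instead of copying the binaries into System32: unpack the Windows binaries somewhere local and set the hadoop.home.dir system property so that org.apache.hadoop.util.Shell can locate bin\winutils.exe. The path C:\hadoop, the class name WinutilsCheck, and the staging path are assumptions for illustration, not from the original post; the point is to exercise the same local-filesystem code path that fails in the stack trace above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class WinutilsCheck {
        public static void main(String[] args) throws Exception {
            // Must be set before Hadoop classes initialize, so Shell can locate
            // %hadoop.home.dir%\bin\winutils.exe (assumed install path).
            System.setProperty("hadoop.home.dir", "C:\\hadoop");

            // Exercise the same path as the failing job submission:
            // mkdirs with an explicit permission ends up in
            // RawLocalFileSystem.setPermission(), which shells out on Windows.
            FileSystem localFs = FileSystem.getLocal(new Configuration());
            Path staging = new Path("C:/tmp/hadoop-staging-check");
            System.out.println("mkdirs succeeded: "
                    + localFs.mkdirs(staging, FsPermission.getDirDefault()));
        }
    }

    If this check passes, the original WordCount job should also submit, since JobSubmissionFiles.getStagingDir() goes through the same mkdirs/setPermission calls. Setting the HADOOP_HOME environment variable to the same directory should work as well.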

  • Original post: https://www.cnblogs.com/qiaoyihang/p/6166154.html