• Errors connecting Kettle to Hadoop


    Writing files from Windows to HDFS with Kettle

    One

    2016/07/19 14:14:53 - Spoon - Starting job...
    2016/07/19 14:14:53 - load_hdfs - Start of job execution
    2016/07/19 14:14:53 - load_hdfs - Starting entry [Hadoop Copy Files]
    2016/07/19 14:14:53 - Hadoop Copy Files - Starting...
    2016/07/19 14:14:53 - Hadoop Copy Files - Processing row, source file/folder: [file:///E:/weblogs_rebuild.txt/weblogs_rebuild.txt] ... destination file/folder: [hdfs://hadoop:8020/data]... wildcard: [^.*.txt]
    2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
    2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : File system exception: Could not copy "file:///E:/weblogs_rebuild.txt/weblogs_rebuild.txt" to "hdfs://hadoop:8020/data/weblogs_rebuild.txt".
    2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Could not write to "hdfs://hadoop:8020/data/weblogs_rebuild.txt".
    2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Permission denied: user=Administrator, access=WRITE, inode="/data/weblogs_rebuild.txt":root:hadoop:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:320)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2517)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2452)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
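
    The key line is the Permission denied message above: Kettle on Windows connects to HDFS as the local Windows user (Administrator), but /data is owned by root:hadoop with mode drwxr-xr-x, so only root may write into it. This can be confirmed from any node with the Hadoop client configured (a sketch; it assumes the hadoop CLI on that node points at this cluster):

    ```shell
    # Show the owner, group, and mode of the target directory itself.
    # Expected form: drwxr-xr-x - root hadoop ... /data
    # "Administrator" is neither the owner nor in group hadoop, so the
    # "other" bits (r-x) apply and WRITE is refused.
    hadoop fs -ls -d /data
    ```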
    

     Some fixes suggested online

    1. Edit the Hadoop configuration file hdfs-site.xml on the server: set the permission-checking property to false and restart Hadoop. I tried this, but after restarting the cluster from Ambari the value reverted to true, presumably because Ambari manages the configuration files and rewrites them on restart, so the change would need to be made through Ambari itself.
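
    The post does not quote the actual snippet; the setting in question is presumably the HDFS permission-check switch (dfs.permissions on Hadoop 1.x, dfs.permissions.enabled on 2.x). A sketch of the edit, with the property name assumed:

    ```xml
    <!-- hdfs-site.xml: turn off HDFS permission checking cluster-wide.
         Property name is an assumption; on Hadoop 1.x it is dfs.permissions. -->
    <property>
      <name>dfs.permissions.enabled</name>
      <value>false</value>
    </property>
    ```

    Note that this disables permission checking for the whole cluster, which is a much blunter instrument than fixing ownership of a single directory.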

    2. Ran chmod 777 on the target directory; the error persisted.

    3. The fix that finally worked:

    hadoop fs -mkdir /user/Administrator
    
    hadoop fs -chown Administrator:hdfs /user/Administrator
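
    These two commands give the Windows user a home directory on HDFS that it owns, so the client-side identity maps to a writable location. A couple of related commands under the same assumptions (directory and user names are taken from the post; the HADOOP_USER_NAME trick is my own addition and only works on an unsecured, non-Kerberos cluster):

    ```shell
    # Verify the new home directory and its ownership:
    hadoop fs -ls -d /user/Administrator

    # Alternative client-side workaround: impersonate an HDFS user that
    # already has write access, set on the Windows box before starting Spoon.
    # Unsecured clusters only; a Kerberized cluster ignores this variable.
    set HADOOP_USER_NAME=root
    ```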
    

     Two

    2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : File system exception: Could not copy "file:///E:/Test/red.txt" to "hdfs://hadoop:8020/kettle/red.txt".
    2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Could not close the output stream for file "hdfs://hadoop:8020/kettle/red.txt".
    2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Connection timed out: no further information
    

    Cause: the error occurred on a POWER server, while the same steps succeeded on an x86 server.
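
    In general, a write that fails only when closing the output stream means the client reached the NameNode (the metadata calls succeeded) but timed out opening the data connection to a DataNode for the block transfer. A quick reachability check from the client machine (a sketch; 50010 is the default DataNode transfer port on this Hadoop generation, and the hostname is the one from the logs, so adjust both as needed):

    ```shell
    # Test whether the DataNode data-transfer port is reachable from the
    # client; a hang or timeout here matches the error above.
    telnet hadoop 50010
    ```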

    For the detailed fix, see my other post, "Starting Kettle on Linux, and writing to HDFS from Kettle on Linux and Windows (3)": http://www.cnblogs.com/womars/p/5718349.html

• Original article: https://www.cnblogs.com/zeppelin/p/5685665.html