• Handling the Hue error StructuredException: timed out (code THRIFTSOCKET): None


    When running a Hive SQL query through the Hue web UI, no results are displayed and the query fails with a timeout.

    The error is as follows:
    [28/Jul/2017 11:23:29 +0800] decorators ERROR error running <function execute at 0x7fa741ddc8c0>
    Traceback (most recent call last):
    File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/decorators.py", line 81, in decorator
    return func(*args, **kwargs)
    File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/api.py", line 109, in execute
    response['handle'] = get_api(request, snippet).execute(notebook, snippet)
    File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 64, in decorator
    return func(*args, **kwargs)
    File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 199, in execute
    db.use(query.database)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/dbms.py", line 613, in use
    return self.client.use(query)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 1053, in use
    data = self._client.execute_query(query)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 748, in execute_query
    return self.execute_query_statement(statement=query.query['query'], max_rows=max_rows, configuration=configuration)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 752, in execute_query_statement
    (results, schema), operation_handle = self.execute_statement(statement=statement, max_rows=max_rows, configuration=configuration, orientation=orientation)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 780, in execute_statement
    res = self.call(self._client.ExecuteStatement, req)
    File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 625, in call
    res = fn(req)
    File "/home/hadoop/.versions/hue-3.10.0/desktop/core/src/desktop/lib/thrift_util.py", line 376, in wrapper
    raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
    StructuredException: timed out (code THRIFTSOCKET): None
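    The final frame of the traceback shows Hue's Thrift wrapper converting a low-level socket timeout into a StructuredException. A simplified sketch of that wrapping pattern (the class and wrapper function below are illustrative reconstructions modeled on the traceback, not Hue's actual code):

    ```python
    import socket

    # Simplified sketch of the error-wrapping pattern in desktop/lib/thrift_util.py.
    # StructuredException and call_with_wrapping are illustrative reconstructions,
    # not Hue's real implementation.
    class StructuredException(Exception):
        def __init__(self, code, message, data=None, error_code=500):
            super().__init__(message)
            self.code = code
            self.message = message
            self.data = data
            self.error_code = error_code

        def __str__(self):
            # Produces messages like "timed out (code THRIFTSOCKET): None"
            return "%s (code %s): %s" % (self.message, self.code, self.data)

    def call_with_wrapping(fn):
        try:
            return fn()
        except socket.timeout as e:
            # The Thrift socket read exceeded server_conn_timeout.
            raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
    ```

    Note that the message itself only says the socket timed out; it does not tell you whether HiveServer2 is down, hung, or merely slow.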

    Attempted fix:
    edit the configuration file

    /home/hadoop/.versions/hue-3.10.0/desktop/conf/hue.ini

    changing
    server_conn_timeout=120
    to
    server_conn_timeout=1200

    then restart the Hue service. This did not appear to help.

    A web search turned up the suggestion that running Hive queries through the Hue web UI requires the HiveServer2 service to be running.

    Checking the configuration in conf/hue.ini showed that Hue connects to port 10000 on uhadoop-bwgkeu-master2. Logging in to master2 and running ps -ef|grep hive showed that HiveServer2 was already running, so the service was restarted, which resolved the problem.

    [hadoop@uhadoop-bwgkeu-master2 log]$ ps -ef|grep hive
    hadoop 3730 1 4 14:05 ? 00:00:19 /usr/java/jdk1.7.0_60/bin/java -Xmx1024m -Dhive.log.dir=/var/log/hive -Dhive.log.file=hive-server2.log -Dhive.log.threshold=INFO -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1000m -Xmx4096m -Xms1024m -XX:NewRatio=12 -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+UseParNewGC -XX:-UseGCOverheadLimit -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hadoop/hive/lib/hive-service-1.2.1.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///home/hadoop/hbase/lib/activation-1.1.jar,file:///home/hadoop/hbase/lib/antisamy-1.4.3.jar,file:///home/hadoop/hbase/lib/apacheds-i18n-2.0.0-M15.jar,file:///home/hadoop/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar,file:///home/hadoop/hbase/lib/api-asn1-api-1.0.0-M20.jar,file:///home/hadoop/hbase/lib/api-util-1.0.0-M20.jar,file:///home/hadoop/hbase/lib/asm-3.2.jar,file:///home/hadoop/hbase/lib/avro-1.7.6-cdh5.8.0.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-core-1.10.6.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-kms-1.10.6.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-s3-1.10.6.jar,file:///home/hadoop/hbase/lib/batik-css-1.7.jar,file:///home/hadoop/hbase/lib/batik-ext-1.7.jar,file:///home/hadoop/hbase/lib/batik-util-1.7.jar,file:///home/hadoop/hbase/lib/bsh-core-2.0b4.jar,file:///home/hadoop/hbase/lib/commons-beanutils-1.7.0.jar,file:///home/hadoop/hbase/lib/commons-beanutils-core-1.7.0.jar,file:///home/hadoop/hbase/lib/commons-cli-1.2.jar,file:///home/hadoop/hbase/lib/commons-codec-1.9.jar,file:///home/hadoop/hbase/lib/commons-collections-3.2.2.jar,file:///home/hadoop/hbase/lib/commons-compress-1.4.1.jar,file:///home/hadoop/hbase/lib/commons-configuration-1.6.jar,file:///hom
e/hadoop/hbase/lib/commons-daemon-1.0.3.jar,file:///home/hadoop/hbase/lib/commons-digester-1.8.jar,file:///home/hadoop/hbase/lib/commons-el-1.0.jar,file:///home/hadoop/hbase/lib/commons-fileupload-1.2.jar,file:///home/hadoop/hbase/lib/commons-httpclient-3.1.jar,file:///home/hadoop/hbase/lib/commons-io-2.4.jar,file:///home/hadoop/hbase/lib/commons-lang-2.6.jar,file:///home/hadoop/hbase/lib/commons-logging-1.2.jar,file:///home/hadoop/hbase/lib/commons-math-2.1.jar,file:///home/hadoop/hbase/lib/commons-math3-3.1.1.jar,file:///home/hadoop/hbase/lib/commons-net-3.1.jar,file:///home/hadoop/hbase/lib/core-3.1.1.jar,file:///home/hadoop/hbase/lib/curator-client-2.7.1.jar,file:///home/hadoop/hbase/lib/curator-framework-2.7.1.jar,file:///home/hadoop/hbase/lib/curator-recipes-2.7.1.jar,file:///home/hadoop/hbase/lib/disruptor-3.3.0.jar,file:///home/hadoop/hbase/lib/esapi-2.1.0.jar,file:///home/hadoop/hbase/lib/findbugs-annotations-1.3.9-1.jar,file:///home/hadoop/hbase/lib/gson-2.2.4.jar,file:///home/hadoop/hbase/lib/guava-12.0.1.jar,file:///home/hadoop/hbase/lib/hadoop-annotations-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-auth-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-aws-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-client-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-hdfs-2.6.0-cdh5.4.9-tests.jar,file:///home/hadoop/hbase/lib/hadoop-hdfs-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-app-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-core-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-api-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-client-2.6.0-cd
h5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-server-commo
    hadoop 6471 1 0 Jun28 ? 00:18:22 /usr/java/jdk1.7.0_60/bin/java -Xmx1024m -Dhive.log.dir=/var/log/hive -Dhive.log.file=hive-metastore.log -Dhive.log.threshold=INFO -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1000m -Xmx4096m -Xms1024m -XX:NewRatio=12 -XX:MaxHeapFreeRatio=40 -XX:MinHeapFreeRatio=15 -XX:+UseParNewGC -XX:-UseGCOverheadLimit -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hadoop/hive/lib/hive-service-1.2.1.jar org.apache.hadoop.hive.metastore.HiveMetaStore
    hadoop 8800 1 0 Jun28 ? 00:44:00 /usr/java/jdk1.7.0_60/bin/java -cp /home/hadoop/lib/hadoop-lzo.jar:/home/hadoop/share/hadoop/common/*:/home/hadoop/share/hadoop/common/lib/*:/home/hadoop/share/hadoop/yarn/*:/home/hadoop/share/hadoop/yarn/lib/*:/home/hadoop/spark/lib/*:/home/hadoop/hive/lib/*:/home/hadoop/hbase/lib/*:/home/hadoop/spark/conf/:/home/hadoop/spark/jars/*:/home/hadoop/conf/ -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.history.HistoryServer
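    Before restarting anything, a quick TCP-level check from the Hue host can confirm whether the HiveServer2 port is reachable at all. A minimal sketch (host and port taken from the hue.ini settings below):

    ```python
    import socket

    def hiveserver2_reachable(host, port, timeout=5):
        """Return True if a TCP connection to host:port opens within timeout seconds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # Covers refused connections, unreachable hosts, DNS failures,
            # and socket.timeout (all OSError subclasses in Python 3).
            return False

    # e.g. hiveserver2_reachable("uhadoop-bwgkeu-master2", 10000)
    ```

    If this returns False, the problem is network connectivity or a dead listener; if it returns True but queries still time out, HiveServer2 is accepting connections without answering requests, which matches the hung-process situation seen here.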

    Configuration notes:

    [beeswax]
    hive_server_host=uhadoop-bwgkeu-master2    # host running HiveServer2
    hive_server_port=10000    # HiveServer2 port
    hive_conf_dir=/home/hadoop/hive/conf    # Hive configuration directory
    server_conn_timeout=1200    # connection timeout (seconds)
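    server_conn_timeout is read from the [beeswax] section as an integer number of seconds. Hue itself parses hue.ini with its own config machinery, but for a flat section like this an equivalent fragment can be checked with Python's stdlib configparser; a small sketch (the inline sample mirrors the fragment above, and in practice you would read the real hue.ini path instead):

    ```python
    import configparser

    # Inline sample mirroring the [beeswax] fragment above; in practice point
    # read() at /home/hadoop/.versions/hue-3.10.0/desktop/conf/hue.ini instead.
    sample = """
    [beeswax]
    hive_server_host=uhadoop-bwgkeu-master2
    hive_server_port=10000
    hive_conf_dir=/home/hadoop/hive/conf
    server_conn_timeout=1200
    """

    cfg = configparser.ConfigParser()
    cfg.read_string(sample)

    host = cfg.get("beeswax", "hive_server_host")
    port = cfg.getint("beeswax", "hive_server_port")
    timeout = cfg.getint("beeswax", "server_conn_timeout")
    ```

    This makes it easy to sanity-check that the timeout change actually took effect in the file Hue is reading.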

    Host name mappings:

    # cat /etc/hosts
    
    10.19.128.248 uhadoop-bwgkeu-master1
    10.19.196.141 uhadoop-bwgkeu-core4
    10.19.80.123 uhadoop-bwgkeu-core2
    10.19.185.160 uhadoop-bwgkeu-core1
    10.19.91.236 uhadoop-bwgkeu-core3
    10.19.72.208 uhadoop-bwgkeu-master2

    A simple Hive query to verify that queries now run correctly:
    SELECT s07.description, s07.salary, s08.salary,
    s08.salary - s07.salary
    FROM
    sample_07 s07 JOIN sample_08 s08
    ON ( s07.code = s08.code)
    WHERE
    s07.salary < s08.salary
    ORDER BY s08.salary-s07.salary DESC
    LIMIT 1000

    An even simpler Hive query example:
    SELECT s07.description, s07.salary
    FROM
    sample_07 s07
    LIMIT 10

    Querying from the Hive shell:

    # hive
    hive> select * from tbl_push_user_req limit 3;
  • Original article: https://www.cnblogs.com/reblue520/p/7255620.html