(Repost) Garbled Chinese in Hive table comments (garbled desc / show create table output)


    Original: https://blog.csdn.net/qq_45124566/article/details/121724876

    Garbled Chinese comments in Hive tables (desc / show create table)

    Problem description:

    create table test(
    id int comment '测试id',
    name string comment '测试姓名'
    )
    comment '测试用表';
    After creating the table with the statement above, the Chinese comments shown by desc test, desc formatted test, or show create table test all come back garbled.
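
    As a rough illustration only (the exact mojibake depends on the client, terminal, and connection character sets, and often shows up as question marks), the output right after creation might look like:

    hive> desc test;
    OK
    id          int       ??id
    name        string    ????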

    Cause:

    The MySQL character set side of the problem:

    Normally MySQL's default encoding is latin1, but day-to-day development mostly calls for UTF-8. With the latin1 default, Chinese text stored in MySQL is easily garbled, so if you run into garbled data it is generally best to switch MySQL's encoding to UTF-8.

    One point must be stressed very strongly, though: the MySQL database that holds Hive's metastore, both the database itself and the encodings of the tables inside it, must be latin1 (CHARACTER SET latin1 COLLATE latin1_bin), or you will get errors.
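
    For reference, if you ever need to recreate the metastore database by hand, a minimal sketch that keeps the required encoding (the name metastore is just the usual convention; use whatever database your hive-site.xml connection URL points at):

    CREATE DATABASE metastore CHARACTER SET latin1 COLLATE latin1_bin;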

    How to verify (either right-click the database in a client tool and check its properties, or run a command):

    mysql> show create database metastore; -- metastore here is the metadata database your Hive uses in MySQL
    +-----------+--------------------------------------------------------------------+
    | Database  | Create Database                                                    |
    +-----------+--------------------------------------------------------------------+
    | metastore | CREATE DATABASE `metastore` /*!40100 DEFAULT CHARACTER SET latin1 |
    |           | COLLATE latin1_bin */                                              |
    +-----------+--------------------------------------------------------------------+
    Fixing the MySQL character set side:

    For MySQL's own character set:
    Check the MySQL character set.

    Log in to MySQL:

    [root@hadoop102 ~]$ mysql -u root -p
    Query the MySQL character set:

    mysql> show variables like 'chara%';
    +--------------------------+----------------------------+
    | Variable_name            | Value                      |
    +--------------------------+----------------------------+
    | character_set_client     | utf8                       |
    | character_set_connection | utf8                       |
    | character_set_database   | latin1                     |
    | character_set_filesystem | binary                     |
    | character_set_results    | utf8                       |
    | character_set_server     | latin1                     |
    | character_set_system     | utf8                       |
    | character_sets_dir       | /usr/share/mysql/charsets/ |
    +--------------------------+----------------------------+
    Set character_set_server to UTF-8.

    To set the MySQL character set to UTF-8 through the config file, first copy my-default.cnf to /etc/my.cnf:

    [root@hadoop102 ~]$ cp /usr/share/mysql/my-default.cnf /etc/my.cnf
    Then open /etc/my.cnf and make the following changes:

    [root@hadoop102 ~]$ vim /etc/my.cnf
    ## add the following two lines above [mysqld]
    [client]
    default-character-set=utf8

    ## add the following lines at the bottom of the [mysqld] section
    [mysqld]
    default-storage-engine=INNODB
    character-set-server=utf8
    collation-server=utf8_general_ci
    Save the file, then restart the MySQL service:

    [root@hadoop102 mysql]$ service mysql restart
    Shutting down MySQL..........                [ OK ]
    Starting MySQL....                           [ OK ]
    [root@hadoop102 mysql]$ service mysql status
    MySQL running (25434)                        [ OK ]
    Log in to MySQL again to check whether the character set changed:

    [root@hadoop102 ~]$ mysql -u root -p
    Query the MySQL character set again:

    mysql> show variables like 'chara%';
    ...
    +--------------------------+----------------------------+
    | Variable_name            | Value                      |
    +--------------------------+----------------------------+
    | character_set_client     | utf8                       |
    | character_set_connection | utf8                       |
    | character_set_database   | utf8                       |
    | character_set_filesystem | binary                     |
    | character_set_results    | utf8                       |
    | character_set_server     | utf8                       |
    | character_set_system     | utf8                       |
    | character_sets_dir       | /usr/share/mysql/charsets/ |
    +--------------------------+----------------------------+
    8 rows in set (0.02 sec)
    Or:

    mysql> \s
    ...
    Server characterset: utf8
    Db characterset: utf8
    Client characterset: utf8
    Conn. characterset: utf8
    ...
    mysql> show variables like "colla%";
    +----------------------+-----------------+
    | Variable_name        | Value           |
    +----------------------+-----------------+
    | collation_connection | utf8_general_ci |
    | collation_database   | utf8_general_ci |
    | collation_server     | utf8_general_ci |
    +----------------------+-----------------+
    With that, MySQL's encoding has been changed successfully.

    Encoding settings for tables, partitions, and views in the Hive metastore database

    Since the metastore keeps latin1 at the database and table level, all we need to do is switch the character set of the comment-bearing columns from latin1 to UTF-8. Comments are used in just three places: tables, partitions, and views. The fix is as follows.

    Connect to the metastore database and run the following 5 SQL statements:

    -- fix column comments and table comments
    alter table COLUMNS_V2 modify column COMMENT varchar(256) character set utf8;
    alter table TABLE_PARAMS modify column PARAM_VALUE varchar(4000) character set utf8;
    -- fix partition comments:
    alter table PARTITION_PARAMS modify column PARAM_VALUE varchar(4000) character set utf8;
    alter table PARTITION_KEYS modify column PKEY_COMMENT varchar(4000) character set utf8;
    -- fix index comments:
    alter table INDEX_PARAMS modify column PARAM_VALUE varchar(4000) character set utf8;
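
    To confirm the change took effect, inspect one of the altered tables. A quick check (the exact output varies by MySQL version, but the modified column should now carry CHARACTER SET utf8):

    mysql> show create table COLUMNS_V2;
    ...
    `COMMENT` varchar(256) CHARACTER SET utf8 DEFAULT NULL,
    ...
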
    With that, the MySQL side of the problem is fully solved.

    Now log in to Hive again: desc test / desc formatted test shows the Chinese correctly, with no mojibake. However, show create table test is still garbled. That one is a problem in Hive itself, and we have to patch and recompile the Hive source to fix it.

    The Hive character set side of the problem:

    desc table_name is fine while show create table table_name still prints mojibake: this is a known Hive community bug. The community's support for Chinese is fairly weak in general, so avoid writing Chinese inside SQL statements where you can; Chinese in the actual data is not a problem.

    Fixing the Hive character set side:

    Compiling Hive from source

    Download Maven: https://archive.apache.org/dist/maven/maven-3/3.5.3/binaries/

    Maven version: 3.5.3

    Download the Hive source: http://archive.apache.org/dist/hive/hive-1.2.1/

    Hive version: 1.2.1

    Upload the two downloaded packages to the server.

    Extract Maven:

    [root@hadoop102 software]$ tar -zxvf apache-maven-3.5.3-bin.tar.gz -C /opt/module/
    Rename it (optional, your choice):

    [root@hadoop102 software]$ cd /opt/module/
    [root@hadoop102 module]$ mv apache-maven-3.5.3/ maven-3.5.3/
    Configure the environment variables:

    [root@hadoop102 module]$ vim /etc/profile

    export JAVA_HOME=/opt/module/jdk1.8.0_144
    export PATH=$PATH:$JAVA_HOME/bin

    export HADOOP_HOME=/opt/module/hadoop-2.7.2
    export PATH=$PATH:$HADOOP_HOME/bin
    export PATH=$PATH:$HADOOP_HOME/sbin

    export ZOO_HOME=/opt/module/zookeeper-3.4.10
    export PATH=$PATH:$ZOO_HOME/bin

    export HIVE_HOME=/opt/module/hive-1.2.1
    export PATH=$PATH:$HIVE_HOME/bin

    export MAVEN_HOME=/opt/module/maven-3.5.3
    export PATH=$PATH:$MAVEN_HOME/bin

    Source the file to make it take effect:

    [root@hadoop102 module]$ source /etc/profile
    Verify that Maven is configured correctly:

    [root@hadoop102 module]$ mvn -version
    Apache Maven 3.5.3 (3383c37e1f9e9b3bc3df5050c29c8aff9f295297; 2018-02-25T03:49:05+08:00)
    Maven home: /opt/module/maven-3.5.3
    Java version: 1.8.0_144, vendor: Oracle Corporation
    Java home: /opt/module/jdk1.8.0_144/jre
    Default locale: zh_CN, platform encoding: UTF-8
    OS name: "linux", version: "2.6.32-642.el6.x86_64", arch: "amd64", family: "unix"
    Replace the Maven mirror in settings.xml to speed up downloads:

    <mirrors>
      <!-- mirror
       | Specifies a repository mirror site to use instead of a given repository. The repository that
       | this mirror serves has an ID that matches the mirrorOf element of this mirror. IDs are used
       | for inheritance and direct lookup purposes, and must be unique across the set of mirrors.
       |
      <mirror>
        <id>mirrorId</id>
        <mirrorOf>repositoryId</mirrorOf>
        <name>Human Readable Name for this Mirror.</name>
        <url>http://my.repository.com/repo/path</url>
      </mirror>
      -->

      <mirror>
        <id>alimaven</id>
        <name>aliyun maven</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
        <mirrorOf>central</mirrorOf>
      </mirror>
    </mirrors>
    Upload the source package apache-hive-1.2.1-src.tar.gz to /usr/local/workspace (any location is fine, as long as you know where it is).

    Extract it:

    [root@hadoop102 workspace]$ cd /usr/local/workspace
    [root@hadoop102 workspace]$ tar -zxvf apache-hive-1.2.1-src.tar.gz
    Go into the root of the extracted source tree:

    [root@hadoop102 workspace]$ cd apache-hive-1.2.1-src

    [root@hadoop102 apache-hive-1.2.1-src]$ cd ql/src/java/org/apache/hadoop/hive/ql/exec/

    [root@hadoop102 exec]$ vim DDLTask.java
    So the text after COMMENT turned into mojibake... is the metadata character set in my MySQL wrong? No: the real culprit is that Hive's own code does not handle Chinese properly. We can fix two lines in the ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java class. Change:

    outStream.writeBytes(createTab_stmt.toString());
    to

    outStream.write(createTab_stmt.toString().getBytes("UTF-8"));
    and likewise change

    outStream.writeBytes(createTab_stmt.render());
    to

    outStream.write(createTab_stmt.render().getBytes("UTF-8"));
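
    Why this fix works: outStream implements java.io.DataOutput, whose writeBytes(String) writes only the low 8 bits of each character, so every multi-byte Chinese character collapses into one unrelated byte. Encoding the string to UTF-8 bytes first keeps the characters intact. Below is a minimal standalone sketch of the difference (illustrative only, not Hive code):

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    public class WriteBytesDemo {
        public static void main(String[] args) throws IOException {
            String comment = "测试用表";  // a Chinese table comment

            // writeBytes keeps only the low 8 bits of each char:
            // '测' (U+6D4B) degenerates into the single byte 0x4B ('K').
            ByteArrayOutputStream garbled = new ByteArrayOutputStream();
            new DataOutputStream(garbled).writeBytes(comment);
            System.out.println(garbled.toString("UTF-8"));  // prints garbage

            // Encoding to UTF-8 first writes the real multi-byte sequences.
            ByteArrayOutputStream intact = new ByteArrayOutputStream();
            intact.write(comment.getBytes("UTF-8"));
            System.out.println(intact.toString("UTF-8"));   // prints 测试用表
        }
    }
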
    Go back to the Hive source root (/usr/local/workspace/apache-hive-1.2.1-src in my case; use your own path) to compile the Hive source.

    Run the build command:

    [root@hadoop102 apache-hive-1.2.1-src]$ mvn clean package -Phadoop-2 -DskipTests
    After a while, output like the following indicates that the build succeeded:

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Hive 1.2.1 ......................................... SUCCESS [ 21.971 s]
    [INFO] Hive Shims Common .................................. SUCCESS [ 6.247 s]
    [INFO] Hive Shims 0.20S ................................... SUCCESS [ 1.812 s]
    [INFO] Hive Shims 0.23 .................................... SUCCESS [ 22.994 s]
    [INFO] Hive Shims Scheduler ............................... SUCCESS [ 1.804 s]
    [INFO] Hive Shims ......................................... SUCCESS [ 1.639 s]
    [INFO] Hive Common ........................................ SUCCESS [ 7.101 s]
    [INFO] Hive Serde ......................................... SUCCESS [01:09 min]
    [INFO] Hive Metastore ..................................... SUCCESS [03:55 min]
    [INFO] Hive Ant Utilities ................................. SUCCESS [ 15.772 s]
    [INFO] Spark Remote Client ................................ SUCCESS [14:57 min]
    [INFO] Hive Query Language ................................ SUCCESS [08:13 min]
    [INFO] Hive Service ....................................... SUCCESS [02:19 min]
    [INFO] Hive Accumulo Handler .............................. SUCCESS [32:07 min]
    [INFO] Hive JDBC .......................................... SUCCESS [ 17.203 s]
    [INFO] Hive Beeline ....................................... SUCCESS [ 40.589 s]
    [INFO] Hive CLI ........................................... SUCCESS [ 41.483 s]
    [INFO] Hive Contrib ....................................... SUCCESS [ 1.790 s]
    [INFO] Hive HBase Handler ................................. SUCCESS [03:35 min]
    [INFO] Hive HCatalog ...................................... SUCCESS [01:23 min]
    [INFO] Hive HCatalog Core ................................. SUCCESS [ 21.449 s]
    [INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 41.616 s]
    [INFO] Hive HCatalog Server Extensions .................... SUCCESS [02:49 min]
    [INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 2.582 s]
    [INFO] Hive HCatalog Webhcat .............................. SUCCESS [02:51 min]
    [INFO] Hive HCatalog Streaming ............................ SUCCESS [ 5.170 s]
    [INFO] Hive HWI ........................................... SUCCESS [ 1.551 s]
    [INFO] Hive ODBC .......................................... SUCCESS [ 0.900 s]
    [INFO] Hive Shims Aggregator .............................. SUCCESS [ 0.133 s]
    [INFO] Hive TestUtils ..................................... SUCCESS [ 0.512 s]
    [INFO] Hive Packaging 1.2.1 ............................... SUCCESS [ 3.053 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:17 h
    [INFO] Finished at: 2021-12-05T00:19:56+08:00
    [INFO] ------------------------------------------------------------------------
    Build time depends on your network environment; you are best off pointing Maven at your company's own repository (the local cache lives in ~/.m2 by default; see the settings.xml sketch below). If you also need to package the compiled files, run the packaging command that follows.
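
    As a sketch, both the local cache location and an internal mirror are configured in ~/.m2/settings.xml; the path below is a placeholder, substitute your own:

    <settings>
      <!-- move the artifact cache off the default ~/.m2/repository -->
      <localRepository>/opt/maven-repo</localRepository>
      ...
    </settings>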

    Run the package command:

    [root@hadoop102 apache-hive-1.2.1-src]$ mvn clean package -Phadoop-2 -DskipTests -Pdist
    Output like the following indicates that the packaging build succeeded:

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Hive 1.2.1 ......................................... SUCCESS [ 8.287 s]
    [INFO] Hive Shims Common .................................. SUCCESS [ 10.172 s]
    [INFO] Hive Shims 0.20S ................................... SUCCESS [ 4.348 s]
    [INFO] Hive Shims 0.23 .................................... SUCCESS [ 8.410 s]
    [INFO] Hive Shims Scheduler ............................... SUCCESS [ 2.085 s]
    [INFO] Hive Shims ......................................... SUCCESS [ 1.981 s]
    [INFO] Hive Common ........................................ SUCCESS [ 17.277 s]
    [INFO] Hive Serde ......................................... SUCCESS [ 8.610 s]
    [INFO] Hive Metastore ..................................... SUCCESS [ 26.505 s]
    [INFO] Hive Ant Utilities ................................. SUCCESS [ 0.793 s]
    [INFO] Spark Remote Client ................................ SUCCESS [ 10.517 s]
    [INFO] Hive Query Language ................................ SUCCESS [03:14 min]
    [INFO] Hive Service ....................................... SUCCESS [ 11.423 s]
    [INFO] Hive Accumulo Handler .............................. SUCCESS [ 7.155 s]
    [INFO] Hive JDBC .......................................... SUCCESS [01:16 min]
    [INFO] Hive Beeline ....................................... SUCCESS [ 3.501 s]
    [INFO] Hive CLI ........................................... SUCCESS [ 2.547 s]
    [INFO] Hive Contrib ....................................... SUCCESS [ 2.442 s]
    [INFO] Hive HBase Handler ................................. SUCCESS [ 10.001 s]
    [INFO] Hive HCatalog ...................................... SUCCESS [ 1.465 s]
    [INFO] Hive HCatalog Core ................................. SUCCESS [ 5.774 s]
    [INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 3.933 s]
    [INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 4.044 s]
    [INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 3.594 s]
    [INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 19.550 s]
    [INFO] Hive HCatalog Streaming ............................ SUCCESS [ 2.780 s]
    [INFO] Hive HWI ........................................... SUCCESS [ 2.134 s]
    [INFO] Hive ODBC .......................................... SUCCESS [ 1.827 s]
    [INFO] Hive Shims Aggregator .............................. SUCCESS [ 0.209 s]
    [INFO] Hive TestUtils ..................................... SUCCESS [ 0.823 s]
    [INFO] Hive Packaging 1.2.1 ............................... SUCCESS [03:00 min]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 10:34 min
    [INFO] Finished at: 2021-12-05T00:45:50+08:00
    [INFO] ------------------------------------------------------------------------
    When it finishes, go to /usr/local/workspace/apache-hive-1.2.1-src/packaging/target/ and find the compiled apache-hive-1.2.1-bin directory. In its lib directory, locate hive-exec-1.2.1.jar and use it to replace the hive-exec-1.2.1.jar in your existing Hive installation's lib directory:

    [root@hadoop102 lib]$ cp /usr/local/workspace/apache-hive-1.2.1-src/packaging/target/apache-hive-1.2.1-bin/apache-hive-1.2.1-bin/lib/hive-exec-1.2.1.jar /opt/module/hive-1.2.1/lib/
    Note: the Hive source here was compiled as root. If your Hive normally runs as a different user, remember to fix the ownership and permissions of the copied hive-exec-1.2.1.jar, for example as sketched below.
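
    A sketch, assuming a hypothetical hive user and group owns the Hive installation:

    [root@hadoop102 lib]$ chown hive:hive /opt/module/hive-1.2.1/lib/hive-exec-1.2.1.jar
    [root@hadoop102 lib]$ chmod 644 /opt/module/hive-1.2.1/lib/hive-exec-1.2.1.jar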

    Then restart the Hive client, and show create table table_name displays Chinese correctly!

    Errors encountered while compiling Hive:

    The first error, fixed by replacing the Maven mirror

    The error:

    [INFO] Hive 1.2.1 ......................................... FAILURE [ 01:02 h]
    [INFO] Hive Shims Common .................................. SKIPPED
    [INFO] Hive Shims 0.20S ................................... SKIPPED
    [INFO] Hive Shims 0.23 .................................... SKIPPED
    [INFO] Hive Shims Scheduler ............................... SKIPPED
    [INFO] Hive Shims ......................................... SKIPPED
    [INFO] Hive Common ........................................ SKIPPED
    [INFO] Hive Serde ......................................... SKIPPED
    ...

    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 01:02 h
    [INFO] Finished at: 2021-12-02T22:31:14+08:00
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-site-plugin:3.3:attach-descriptor (attach-descriptor) on project hive: Execution attach-descriptor of goal org.apache.maven.plugins:maven-site-plugin:3.3:attach-descriptor failed: Plugin org.apache.maven.plugins:maven-site-plugin:3.3 or one of its dependencies could not be resolved: The following artifacts could not be resolved: org.apache.maven.reporting:maven-reporting-exec:jar:1.1, org.apache.maven:maven-artifact:jar:3.0, org.apache.maven.shared:maven-shared-utils:jar:0.3, com.google.code.findbugs:jsr305:jar:2.0.1, org.apache.struts:struts-taglib:jar:1.3.8: Could not transfer artifact org.apache.maven.reporting:maven-reporting-exec:jar:1.1 from/to central (https://repo.maven.apache.org/maven2): Read timed out -> [Help 1]
    The connection timed out, so replace the Maven mirror:

    <mirrors>
      <mirror>
        <id>alimaven</id>
        <name>aliyun maven</name>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
        <mirrorOf>central</mirrorOf>
      </mirror>
    </mirrors>
    The following error appeared while running "mvn install -DskipTests -Dcheckstyle.skip=true".

    Honestly, I never figured out why this error occurs; if anyone knows, please share. Thanks in advance.

    How did I get around it? Oddly enough, I simply switched to a different command and the problem went away, which still puzzles me.

    The command I switched to was "mvn clean package -Phadoop-2 -DskipTests".

    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO]
    [INFO] Hive 1.2.1 ......................................... SUCCESS [ 10.851 s]
    [INFO] Hive Shims Common .................................. SUCCESS [ 5.831 s]
    [INFO] Hive Shims 0.20S ................................... SUCCESS [ 2.751 s]
    [INFO] Hive Shims 0.23 .................................... SUCCESS [ 8.996 s]
    [INFO] Hive Shims Scheduler ............................... SUCCESS [ 4.800 s]
    [INFO] Hive Shims ......................................... SUCCESS [ 2.806 s]
    [INFO] Hive Common ........................................ FAILURE [ 12.649 s]
    [INFO] Hive Serde ......................................... SKIPPED
    [INFO] Hive Metastore ..................................... SKIPPED
    [INFO] Hive Ant Utilities ................................. SKIPPED
    [INFO] Spark Remote Client ................................ SKIPPED
    [INFO] Hive Query Language ................................ SKIPPED
    [INFO] Hive Service ....................................... SKIPPED
    [INFO] Hive Accumulo Handler .............................. SKIPPED
    [INFO] Hive JDBC .......................................... SKIPPED
    [INFO] Hive Beeline ....................................... SKIPPED
    [INFO] Hive CLI ........................................... SKIPPED
    [INFO] Hive Contrib ....................................... SKIPPED
    [INFO] Hive HBase Handler ................................. SKIPPED
    [INFO] Hive HCatalog ...................................... SKIPPED
    [INFO] Hive HCatalog Core ................................. SKIPPED
    [INFO] Hive HCatalog Pig Adapter .......................... SKIPPED
    [INFO] Hive HCatalog Server Extensions .................... SKIPPED
    [INFO] Hive HCatalog Webhcat Java Client .................. SKIPPED
    [INFO] Hive HCatalog Webhcat .............................. SKIPPED
    [INFO] Hive HCatalog Streaming ............................ SKIPPED
    [INFO] Hive HWI ........................................... SKIPPED
    [INFO] Hive ODBC .......................................... SKIPPED
    [INFO] Hive Shims Aggregator .............................. SKIPPED
    [INFO] Hive TestUtils ..................................... SKIPPED
    [INFO] Hive Packaging 1.2.1 ............................... SKIPPED
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 51.038 s
    [INFO] Finished at: 2021-12-03T12:16:34+08:00
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-common: Compilation failure: Compilation failure:
    [ERROR] /usr/local/workspace/apache-hive-1.2.1-src/common/src/java/org/apache/hadoop/hive/common/jsonexplain/tez/Vertex.java:[26,28] package org.codehaus.jackson does not exist
    [ERROR] /usr/local/workspace/apache-hive-1.2.1-src/common/src/java/org/apache/hadoop/hive/common/jsonexplain/tez/Vertex.java:[27,32] package org.codehaus.jackson.map does not exist
    [ERROR] /usr/local/workspace/apache-hive-1.2.1-src/common/src/java/org/apache/hadoop/hive/common/jsonexplain/tez/Vertex.java:[83,53] cannot find symbol
    [ERROR]   symbol:   class JsonParseException
    [ERROR]   location: class org.apache.hadoop.hive.common.jsonexplain.tez.Vertex
    [ERROR] /usr/local/workspace/apache-hive-1.2.1-src/common/src/java/org/apache/hadoop/hive/common/jsonexplain/tez/Vertex.java:[83,73] cannot find symbol
    [ERROR]   symbol:   class JsonMappingException
    ...
    The error below had no real impact, so I left it alone; dig into it yourself if you are interested (it appears to be a harmless warning from the Velocity engine used by the Maven site plugin):

    [ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
