Environment: CentOS Linux, java version "13.0.2"
1. Download and extract Zeppelin
wget -bc https://mirrors.tuna.tsinghua.edu.cn/apache/zeppelin/zeppelin-0.9.0-preview1/zeppelin-0.9.0-preview1-bin-all.tgz
Note: the -b flag runs the download in the background, and -c resumes a partially completed download.
tar -xvzf zeppelin-0.9.0-preview1-bin-all.tgz
2. In the conf directory, create the config file from its template:
cp zeppelin-site.xml.template zeppelin-site.xml
3. For remote access to Zeppelin, change the default bind address (127.0.0.1) to 0.0.0.0:
<property>
  <name>zeppelin.server.addr</name>
  <value>0.0.0.0</value>
  <description>Server binding address</description>
</property>
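The edit above can also be scripted. A minimal sketch, demonstrated on a temporary copy of the snippet so it is safe to run anywhere; in a real install you would point CONF at conf/zeppelin-site.xml instead:

```shell
#!/usr/bin/env bash
# Switch zeppelin.server.addr from 127.0.0.1 to 0.0.0.0 with sed.
# CONF is a temp file here for demonstration; in a real install use
# CONF=conf/zeppelin-site.xml. Note the pattern would also rewrite any
# other 127.0.0.1 <value> in the file, so check the template first.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<property>
  <name>zeppelin.server.addr</name>
  <value>127.0.0.1</value>
  <description>Server binding address</description>
</property>
EOF
sed -i 's#<value>127.0.0.1</value>#<value>0.0.0.0</value>#' "$CONF"
grep '<value>' "$CONF"
```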
4. Start Zeppelin
bin/zeppelin-daemon.sh start
5. Access the UI (on Tencent Cloud Linux, port 8080 must first be opened in the firewall/security group)
http://ip:8080/
6. Download and extract Flink
tar -xzvf flink-1.10.0-bin-scala_2.11.tgz
7. Copy flink-python_2.11-1.10.0.jar into lib
cd flink-1.10.0
cp opt/flink-python_2.11-1.10.0.jar lib
8. Connect to Hive
1) Start hadoop-3.2.1
hdfs --daemon start namenode
hdfs --daemon start secondarynamenode
hdfs --daemon start datanode
yarn --daemon start resourcemanager
yarn --daemon start nodemanager
mapred --daemon start historyserver
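The six start commands above can be wrapped so that a failure is reported immediately instead of being masked by the daemons started after it. A sketch with a hypothetical helper; the real calls are left commented out since they need the hadoop-3.2.1 binaries on PATH:

```shell
#!/usr/bin/env bash
# start_daemon: run one "<tool> --daemon start <service>" command,
# echo what is being started, and report failure loudly.
start_daemon() {
  echo "starting: $*"
  "$@" || { echo "FAILED: $*" >&2; return 1; }
}

# Real sequence (uncomment on a machine with hadoop on PATH):
# start_daemon hdfs --daemon start namenode &&
# start_daemon hdfs --daemon start secondarynamenode &&
# start_daemon hdfs --daemon start datanode &&
# start_daemon yarn --daemon start resourcemanager &&
# start_daemon yarn --daemon start nodemanager &&
# start_daemon mapred --daemon start historyserver
```

Chaining with && stops the sequence at the first daemon that fails to launch, which keeps the eventual jps output easy to interpret.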
2) Start the Hive metastore (apache-hive-3.1.2)
nohup hive --service metastore 1> /opt/hive/current/metastore.log 2>/opt/hive/current/metastor_err.log &
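Before pointing Flink at the metastore, it is worth confirming that the thrift port (9083 by default) is actually accepting connections. A bash-only sketch using /dev/tcp, with a hypothetical helper name:

```shell
#!/usr/bin/env bash
# wait_port HOST PORT [TRIES]: poll until a TCP connection to HOST:PORT
# succeeds, retrying once per second, up to TRIES attempts (default 30).
wait_port() {
  local host=$1 port=$2 tries=${3:-30} i
  for ((i = 0; i < tries; i++)); do
    # Opening /dev/tcp in a subshell attempts a real TCP connect.
    (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null && return 0
    sleep 1
  done
  return 1
}

# Real use, after starting the metastore:
# wait_port localhost 9083 || echo "metastore not reachable on 9083" >&2
```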
3) Copy the jars
cp flink-connector-hive_2.11-1.10.0.jar /app/flink-1.10.0/lib
cp hive-exec-3.1.2.jar /app/flink-1.10.0/lib
4) Start Zeppelin; it fails.
Troubleshooting principle: NoSuchMethodException/NoSuchMethodError usually means a jar with the wrong version; ClassNotFound means a jar is missing (Jeff Zhang).
The error is:
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create Hive Metastore client
    at org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:52)
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240)
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71)
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35)
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:188)
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:102)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:235)
    at org.apache.zeppelin.flink.FlinkScalaInterpreter.open(FlinkScalaInterpreter.scala:427)
    at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:53)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 13 more
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(org.apache.hadoop.hive.conf.HiveConf, boolean)
    at java.lang.Class.getMethod(Class.java:1786)
    at org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:48)
    ... 22 more
org.apache.flink.table.catalog.hive.client.HiveShimV230.getHiveMetastoreClient(HiveShimV230.java:48)
The HiveShimV230 in the trace means Flink assumed the Hive version is 2.3, so on the Zeppelin interpreter settings page set zeppelin.flink.hive.version to 3.1.2.
Restart Zeppelin; the next error is:
ClassNotFound com/facebook/fb303/FacebookService$Iface
Searching shows that com/facebook/fb303/FacebookService$Iface is shipped in libfb303.jar.
Download libfb303.jar into /app/flink-1.10.0/lib
Restart Zeppelin; the next error is:
NoSuchMethodError hadoop.hive.metastore.api.ThriftHiveMetastore$Client.sendBase
Analysis: without libfb303.jar the error was ClassNotFound (a missing jar); with it, the error became NoSuchMethodError (a wrong jar version). So this copy of libfb303.jar is the wrong version.
Delete the old jar and download the latest libfb303.jar into /app/flink-1.10.0/lib:
https://repo1.maven.org/maven2/org/apache/thrift/libfb303/0.9.3/libfb303-0.9.3.jar
Restart Zeppelin; everything now works.
Summary:
1) Jars required for Zeppelin to connect to Hive 3.1.2:
flink-connector-hive_2.11-1.10.0.jar hive-exec-3.1.2.jar libfb303-0.9.3.jar
2) Troubleshooting methodology:
NoSuchMethodException/NoSuchMethodError usually means a jar with the wrong version; ClassNotFound means a jar is missing.
https://www.journaldev.com/14538/java-lang-nosuchmethoderror (reference)
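The "which jar contains this class" search from step 8 can also be done locally instead of via a web search. Zip archives store entry names as plain (uncompressed) bytes, so a raw grep over the jar files usually identifies the owner. A sketch with a hypothetical helper name:

```shell
#!/usr/bin/env bash
# find_class_jar CLASSPATH_FRAGMENT DIR
# Lists the jars under DIR that contain the given class path, e.g.
#   find_class_jar 'com/facebook/fb303/FacebookService' /app/flink-1.10.0/lib
# This works because zip/jar files store entry names verbatim, so the
# class path string is grep-able even though the class bytes are compressed.
find_class_jar() {
  grep -l "$1" "$2"/*.jar 2>/dev/null
}
```

If `unzip` is installed, `unzip -l some.jar | grep FacebookService` gives the same answer with an exact entry listing.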
9. Install pip and Python (via Anaconda)
wget -c https://repo.anaconda.com/archive/Anaconda3-2020.02-Linux-x86_64.sh
10. Install Docker Compose
sudo curl -L https://github.com/docker/compose/releases/download/1.21.2/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
When done, exit the current shell and log back in.
11. Connect to Kafka (running in Docker)
Add to /etc/hosts: 127.0.0.1 broker
Otherwise the client cannot connect (the dockerized broker typically advertises itself under the hostname broker, which must resolve on the client side).
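The hosts-file edit can be made idempotent so that re-running the setup does not stack duplicate lines. A sketch with a hypothetical helper, demoed on a temp file; against the real /etc/hosts it must run as root:

```shell
#!/usr/bin/env bash
# add_host_alias FILE IP NAME: append "IP NAME" to FILE unless NAME is
# already mapped there. NAME is assumed to contain no regex metacharacters.
add_host_alias() {
  grep -qE "[[:space:]]$3([[:space:]]|\$)" "$1" || echo "$2 $3" >> "$1"
}

# Demo on a temp file; for real use: sudo add_host_alias /etc/hosts 127.0.0.1 broker
HOSTS=$(mktemp)
add_host_alias "$HOSTS" 127.0.0.1 broker
add_host_alias "$HOSTS" 127.0.0.1 broker   # second call is a no-op
cat "$HOSTS"
```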
Many thanks to Jeff Zhang from the community.