In an Ambari-managed environment, starting Kylin from Ambari fails with the error `A couple of hive jars can't be found: , please check jar files in current HCAT_HOME or export HCAT_HOME='YOUR_LOCAL_HCAT_HOME'`, as shown below:
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/KYLIN/package/scripts/kylin_master.py", line 75, in <module>
KylinMaster().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/KYLIN/package/scripts/kylin_master.py", line 54, in start
Execute(cmd, user='hdfs')
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '. /var/lib/ambari-agent/tmp/kylin_env.rc;/usr/hdp/3.1.0.0-78/kylin/bin/kylin.sh start;cp -rf /usr/hdp/3.1.0.0-78/kylin/pid /var/run/kylin/kylin.pid' returned 1. Retrieving hadoop conf dir...
...................................................[PASS]
KYLIN_HOME is set to /usr/hdp/3.1.0.0-78/kylin
Checking HBase
...................................................[PASS]
Checking hive
...................................................[PASS]
Checking hadoop shell
...................................................[PASS]
Checking hdfs working dir
...................................................[PASS]
Retrieving Spark dependency...
...................................................[PASS]
Retrieving Flink dependency...
Optional dependency flink not found, if you need this; set FLINK_HOME, or run bin/download-flink.sh
...................................................[PASS]
Retrieving kafka dependency...
Couldn't find kafka home. If you want to enable streaming processing, Please set KAFKA_HOME to the path which contains kafka dependencies.
...................................................[PASS]
Checking environment finished successfully. To check again, run 'bin/check-env.sh' manually.
Retrieving hive dependency...
A couple of hive jars can't be found: , please check jar files in current HCAT_HOME or export HCAT_HOME='YOUR_LOCAL_HCAT_HOME'
cp: cannot stat ‘/usr/hdp/3.1.0.0-78/kylin/pid’: No such file or directory
- Solution: Kylin locates the Hive and HCatalog jars through the HIVE_HOME and HCAT_HOME environment variables, which are not set in this environment. Export them in /etc/profile and reload it, then restart Kylin from Ambari:

vim /etc/profile
export HIVE_HOME=/usr/hdp/current/hive-server2
export HCAT_HOME=/usr/hdp/current/hive-webhcat
source /etc/profile
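Before restarting Kylin, it helps to confirm that the exported directories actually contain jar files, since Kylin's dependency check fails exactly when it finds none. The sketch below is a minimal, hypothetical helper (not part of Kylin itself); the `share/hcatalog` subpath is an assumption based on the usual HDP layout, so adjust it to your installation:

```shell
# Minimal sketch: check whether a directory tree contains any jar files,
# mirroring what Kylin's hive-dependency lookup needs to succeed.
check_jars() {
  dir="$1"
  # Count jars under the directory; tr strips padding some wc builds add.
  count=$(find "$dir" -name '*.jar' 2>/dev/null | wc -l | tr -d ' ')
  if [ "$count" -gt 0 ]; then
    echo "OK: $count jar(s) under $dir"
    return 0
  else
    echo "MISSING: no jars under $dir" >&2
    return 1
  fi
}

# Typical usage after sourcing /etc/profile (paths assumed, verify locally):
# check_jars "$HIVE_HOME/lib"
# check_jars "$HCAT_HOME/share/hcatalog"
```

If either call reports MISSING, the exported path is wrong for your stack version and Kylin will raise the same error again on the next start.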