Error "HWI WAR file not found" when starting the Hive HWI service
[root@idc01-vm-test-124 bin]# ./hive --service hwi
14/04/20 11:22:31 INFO hwi.HWIServer: HWI is starting up
14/04/20 11:22:31 WARN hwi.HWIServer: hive.hwi.listen.port was not specified defaulting to 9999
14/04/20 11:22:31 FATAL hwi.HWIServer: HWI WAR file not found at /usr/local/hadoop/hive-0.7.1-cdh3u6/lib/hive-hwi-0.7.1-cdh3u6.war
Solution:
Go into the conf directory:
vim hive-site.xml
Add the following configuration properties:
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-0.7.1-cdh3u6.war</value>
  <description>This sets the path to the HWI war file, relative to ${HIVE_HOME}.</description>
</property>
<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
  <description>This is the host address the Hive Web Interface will listen on</description>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
  <description>This is the port the Hive Web Interface will listen on</description>
</property>
Note: the value of hive.hwi.war.file must match the actual WAR file that exists under lib/, in this case hive-hwi-0.7.1-cdh3u6.war.
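A quick way to confirm the exact WAR file name before setting the property (the path below is the install directory shown in the error log above; adjust it if your installation lives elsewhere):

ls /usr/local/hadoop/hive-0.7.1-cdh3u6/lib/hive-hwi-*.war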
Next, start the service again:
./hive --service hwi
14/04/20 11:30:07 INFO hwi.HWIServer: HWI is starting up
14/04/20 11:30:07 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
14/04/20 11:30:07 INFO mortbay.log: jetty-6.1.14
14/04/20 11:30:07 INFO mortbay.log: Extract jar:file:/usr/local/hadoop/hive-0.7.1-cdh3u6/lib/hive-hwi-0.7.1-cdh3u6.war!/ to /tmp/Jetty_0_0_0_0_9999_hive.hwi.0.7.1.cdh3u6.war__hwi__co5uvf/webapp
14/04/20 11:30:07 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999
DONE.
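Once the Started SocketConnector line appears, the Hive Web Interface should be reachable on the host and port configured above; by default HWI is served under the /hwi context path. A quick check from the shell (substitute your server's address for 127.0.0.1):

curl http://127.0.0.1:9999/hwi/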