
Hive Installation Bug Roundup

程序员文章站 2024-03-17 14:00:22
一  Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
1  A problem in the Hive config file
   For beginners, configuring just the database-connection properties (in hive-site.xml) is enough:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://127.0.0.1:3306/metastore_db?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>a123</value>
  <description>password to use against metastore database</description>
</property>

  Reference: http://bo-hai.iteye.com/blog/1880930
2  Cannot connect to the database
   There are several possible causes; here are a few:
   (1) The database username or password is wrong.
   (2) The username lacks the required privileges.
   (3) The port number or database name is wrong (in my case it was the database name).
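Causes (1) and (2) can be ruled out from the MySQL side before touching Hive at all. Below is a minimal sketch, assuming a local MySQL reachable as root, and using the username (hive), password (a123), and database (metastore_db) from the hive-site.xml above; note that `CREATE USER IF NOT EXISTS` requires MySQL 5.7 or later.

```shell
# Create the metastore database and a Hive user with full privileges on it.
# Names and password are taken from the hive-site.xml config above.
mysql -u root -p <<'SQL'
CREATE DATABASE IF NOT EXISTS metastore_db;
CREATE USER IF NOT EXISTS 'hive'@'localhost' IDENTIFIED BY 'a123';
GRANT ALL PRIVILEGES ON metastore_db.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
SQL

# Verify the credentials, host, port, and database name independently of Hive.
# If this fails, Hive's metastore connection will fail for the same reason.
mysql -h 127.0.0.1 -P 3306 -u hive -pa123 metastore_db -e 'SELECT 1;'
```

If the `SELECT 1` check succeeds but Hive still cannot connect, the problem is more likely in the JDBC URL or driver jar than in the database itself.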


FAILED: Error in metadata: javax.jdo.JDODataStoreException: Error(s) were found while auto-creating/validating the datastore for classes. The errors are printed in the log, and are attached to this exception.
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

The fix is to run `alter database hive character set latin1;` against the Hive metadata database in MySQL (substitute your own metastore database name, e.g. metastore_db from the config above). Changing the metastore's character set to latin1 works because latin1 uses 1 byte per character instead of utf8's 3, so index keys no longer exceed InnoDB's 767-byte prefix limit.

Reference: http://www.cnblogs.com/Blueren/archive/2011/06/29/Sir_001.html
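The fix above can be applied and verified in one pass. This is a sketch assuming the metastore database is named metastore_db as in the config earlier (the referenced post uses a database named "hive" instead):

```shell
# Change the metastore database's default character set to latin1,
# then read it back from information_schema to confirm the change.
mysql -u root -p <<'SQL'
ALTER DATABASE metastore_db CHARACTER SET latin1;
SELECT DEFAULT_CHARACTER_SET_NAME
  FROM information_schema.SCHEMATA
 WHERE SCHEMA_NAME = 'metastore_db';
SQL
```

Tables that were already created before the change keep their old character set; only the database default for new tables is affected, which is enough here because the error occurs while Hive auto-creates its metastore tables.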


二  show tables; works fine, but creating a table throws an exception:

Cannot create directory /user/hive/warehouse/test. Name node is in safe mode.

Found via Google:

your NN is not coming out of safemode. it should do that automatically after a few seconds. use this command to come out of it manually :
bin/hadoop dfsadmin -safemode leave
then retry with your command.

Running bin/hadoop dfsadmin -safemode leave fixed it. (Note: the NameNode normally leaves safe mode on its own once enough block replicas have been reported; forcing it out is only safe when you know why it is stuck.)
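Before forcing the NameNode out of safe mode, it is worth checking its current state. A short sketch, assuming you run from the Hadoop install directory as in the quoted answer:

```shell
# Check whether the NameNode is currently in safe mode.
bin/hadoop dfsadmin -safemode get

# Block until the NameNode leaves safe mode on its own (the normal path).
bin/hadoop dfsadmin -safemode wait

# Or force it out manually, as in the quoted answer above.
bin/hadoop dfsadmin -safemode leave
```

Prefer `wait` during normal startup; reserve `leave` for cases where the NameNode is genuinely stuck (e.g. missing blocks after an unclean shutdown).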