Hive 2.3.6 Environment Setup Notes
Hive must be installed on top of an existing Hadoop environment.
1. Extract apache-hive-2.3.6-bin.tar.gz
Extract the archive and move it to the target directory:
tar zxvf apache-hive-2.3.6-bin.tar.gz
mv apache-hive-2.3.6-bin /usr/local/hadoop/
Place a MySQL JDBC driver in Hive's lib directory:
mv mysql-connector-java-5.1.22.jar /usr/local/hadoop/apache-hive-2.3.6-bin/lib/
2. Configure environment variables
Edit the profile:
vim ~/.bash_profile
export HIVE_HOME=/usr/local/hadoop/apache-hive-2.3.6-bin
export PATH=$HIVE_HOME/bin:$PATH
After adding these lines, reload the profile:
source ~/.bash_profile
Run hive --version to verify the installation.
3. Modify the Hive configuration
Enter the Hive configuration directory:
cd /usr/local/hadoop/apache-hive-2.3.6-bin/conf/
3.1 Modify hive-env.sh
Copy the hive-env.sh.template template and edit it:
cp hive-env.sh.template hive-env.sh
vim hive-env.sh
Add JAVA_HOME and HADOOP_HOME:
export JAVA_HOME=/var/java/jdk1.8.0_11
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.8.5
3.2 Modify hive-site.xml
Find javax.jdo.option.ConnectionURL and set the database connection URL
Find javax.jdo.option.ConnectionPassword and set the password
Find javax.jdo.option.ConnectionUserName and set the user name
Find javax.jdo.option.ConnectionDriverName and set the JDBC driver class
Find hive.metastore.warehouse.dir and set the warehouse location on HDFS
Find hive.exec.scratchdir and set the scratch directory
Find hive.querylog.location and set the query log directory
If Hive later fails to start with java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: xxxxxxx, the usual cause is unresolved ${system:...} placeholders left in hive-site.xml.
<configuration>
<!-- Note: & must be escaped as &amp; in XML -->
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://39.99.184.255:3306/hive?characterEncoding=UTF-8&amp;createDatabaseIfNotExist=true</value>
<description>the URL of the MySQL database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>username</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>pwd</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/hive/warehouse</value>
</property>
<property>
<name>hive.exec.scratchdir</name>
<value>/hive/tmp</value>
</property>
<property>
<name>hive.querylog.location</name>
<value>/hive/log</value>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>false</value>
</property>
<property>
<name>beeline.hs2.connection.user</name>
<value>root</value>
</property>
<property>
<name>beeline.hs2.connection.password</name>
<value>root</value>
</property>
<property>
<name>beeline.hs2.connection.hosts</name>
<value>localhost:10000</value>
</property>
</configuration>
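If startup fails with the Relative path in absolute URI error mentioned above, a commonly reported fix is to replace the ${system:java.io.tmpdir}/${system:user.name} placeholders in the stock hive-site.xml template with absolute local paths. The property names below come from the default template; the /tmp/hive location is only an example:

```xml
<!-- Example fix: point the local scratch and resource directories
     at absolute paths instead of ${system:java.io.tmpdir} -->
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/tmp/hive/resources</value>
</property>
```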
4. Create the required directories on HDFS
hdfs dfs -mkdir /hive
hdfs dfs -mkdir /hive/tmp
hdfs dfs -mkdir /hive/log
hdfs dfs -mkdir /hive/warehouse
hdfs dfs -chmod -R 777 /hive
5. Start Hive
Enter the bin directory and initialize the metastore schema:
cd /usr/local/hadoop/apache-hive-2.3.6-bin/bin
./schematool -dbType mysql -initSchema
The MySQL metastore schema is now initialized.
Start the metastore service:
hive --service metastore &
Start HiveServer2:
hive --service hiveserver2 &