Installing and Configuring Hadoop on CentOS
- Versions: CentOS-6.8-x86_64-minimal, Hadoop 2.6.4, JDK 1.7.0
- First, download the JDK and Hadoop archives, transfer them to the CentOS machine, and extract them (see the example below)
Downloading and transferring are not covered here; extract with: tar -zxvf <archive>
Rename a directory with: mv <old name> <new name>
(mind the spaces between arguments)
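For example, assuming the archives were uploaded to the current directory and /apps is used as the install directory (the archive file names below are placeholders for whatever versions you downloaded):
mkdir -p /apps
tar -zxvf jdk-7u25-linux-x64.tar.gz -C /apps/
tar -zxvf hadoop-2.6.4.tar.gz -C /apps/
# the JDK unpacks to /apps/jdk1.7.0_25 and Hadoop to /apps/hadoop-2.6.4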
- Configure the JDK first
- cd into the JDK directory and run pwd to get its absolute path; copy it for later use: /apps/jdk1.7.0_25
- Configure the environment variables
vi ~/.bash_profile
export JAVA_HOME=/apps/jdk1.7.0_25
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
source ~/.bash_profile
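A quick way to confirm the new PATH is picked up (the version string should match the JDK installed above):
java -version
# should report java version "1.7.0_25"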
- Disable the firewall and set up passwordless SSH
- Disable the firewall (on every node)
- service iptables stop
- chkconfig iptables off
- Passwordless SSH login (a verification example follows this list)
- ssh-keygen -t rsa (run on every node)
- master: cat /root/.ssh/id_rsa.pub >>/root/.ssh/authorized_keys
- master:scp /root/.ssh/authorized_keys @slave1:/root/.ssh/authorized_keys
- slave1:cat /root/.ssh/id_rsa.pub >>/root/.ssh/authorized_keys
- slave1:scp /root/.ssh/authorized_keys @slave2:/root/.ssh/authorized_keys
- slave2:cat /root/.ssh/id_rsa.pub >>/root/.ssh/authorized_keys
- slave2:scp /root/.ssh/authorized_keys @master:/root/.ssh/authorized_keys
- master:scp /root/.ssh/authorized_keys @slave1:/root/.ssh/authorized_keys
- master:scp /root/.ssh/authorized_keys @slave2:/root/.ssh/authorized_keys
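The commands above assume master, slave1, and slave2 can resolve one another, e.g. via /etc/hosts entries on every node (the IP addresses below are placeholders for your own):
192.168.1.10 master
192.168.1.11 slave1
192.168.1.12 slave2
Once the keys have been distributed, verify passwordless login from the master:
ssh slave1 hostname
ssh slave2 hostname
# each command should print the remote hostname without asking for a password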
- Next, configure Hadoop
- Environment variables: vi ~/.bash_profile
export HADOOP_HOME=/apps/hadoop-2.6.4
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
source ~/.bash_profile
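To check that the Hadoop variables took effect:
hadoop version
# should report Hadoop 2.6.4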
- Edit the configuration files (all located under $HADOOP_HOME/etc/hadoop)
- core-site.xml
-
<property>
<name>fs.defaultFS</name>
<value>hdfs://master:9000</value>
</property>
-
- hdfs-site.xml
-
<property>
<name>dfs.replication</name>
<value>3</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>/app/hadoop/dfs/name</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/app/hadoop/dfs/data</value>
</property>
<property>
<name>dfs.secondary.http.address</name>
<value>slave2:50090</value>
</property>
<property>
<name>dfs.namenode.checkpoint.dir</name>
<value>/app/hadoop/dfs/namesecondary</value>
</property>
-
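Optionally, sanity-check on the master that HDFS picks up the new settings (assuming the PATH configured earlier):
hdfs getconf -confKey dfs.replication
# should print 3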
- hadoop-env.sh
- export JAVA_HOME=/apps/jdk1.7.0_25
- yarn-site.xml
-
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.resourcemanager.hostname</name>
<value>master</value>
</property>
- mapred-site.xml
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
- slaves
- Replace the existing contents with the slave hostnames (see the example below)
-
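In Hadoop 2.6.x, mapred-site.xml usually ships only as a template, so it may need to be created first:
cp $HADOOP_HOME/etc/hadoop/mapred-site.xml.template $HADOOP_HOME/etc/hadoop/mapred-site.xml
A minimal slaves file matching the hostnames used above ($HADOOP_HOME/etc/hadoop/slaves):
slave1
slave2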
- Send the configured JDK and Hadoop to the other two hosts
- master:scp -r apps/ @slave1:/apps/
- master:scp -r apps/ @slave2:/apps/
- master:scp ~/.bash_profile @slave1:~/.bash_profile
- master:scp ~/.bash_profile @slave2:~/.bash_profile
- slave1: source ~/.bash_profile
- slave2: source ~/.bash_profile
The Hadoop cluster configuration is now complete.
Format the NameNode (on the master; in Hadoop 2.x, hdfs namenode -format is the preferred equivalent of the command below)
hadoop namenode -format
Start all daemons (on the master)
start-all.sh
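To confirm the cluster started correctly, run jps on every node; with the configuration above the daemons should roughly be:
jps
# master: NameNode, ResourceManager
# slave1: DataNode, NodeManager
# slave2: DataNode, NodeManager, SecondaryNameNode
The NameNode web UI should then be reachable at http://master:50070 and the ResourceManager UI at http://master:8088 (Hadoop 2.x default ports).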