
Installing a Hadoop Pseudo-Distributed Cluster on CentOS 7


1. Environment:

JDK home:        /opt/jdk1.8
User:            devops
Hadoop home:     /opt/hadoop-3.2.0
Hadoop version:  3.2.0
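
The steps below always call Hadoop through relative paths (bin/, sbin/), so nothing has to be added to PATH. If you would rather have the commands available from any directory, a minimal sketch of what the devops user's ~/.bashrc could contain (optional; not part of the original setup):

#optional: append to ~/.bashrc, then run "source ~/.bashrc"
export JAVA_HOME=/opt/jdk1.8
export HADOOP_HOME=/opt/hadoop-3.2.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin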

 

2. Installation and Configuration:

 

$cd /opt
$wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
$tar -xzf ./hadoop-3.2.0.tar.gz
$cd hadoop-3.2.0
#change this to your actual JDK home
$sed -i 's/#  JAVA_HOME=\/usr\/java\/testing.*/JAVA_HOME=\/opt\/jdk1.8/g'  ./etc/hadoop/hadoop-env.sh
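
The sed above rewrites the commented JAVA_HOME example line that ships in hadoop-env.sh. A quick way to confirm the substitution actually happened (if nothing is printed, open the file and set JAVA_HOME by hand):

$grep '^JAVA_HOME' ./etc/hadoop/hadoop-env.sh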

$./bin/hadoop version

$vi etc/hadoop/core-site.xml
#add the following inside the <configuration> element:
 <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
 </property>
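
fs.defaultFS points every HDFS client (including the hdfs dfs commands used later) at the local NameNode on port 9000. The property has to sit inside the <configuration> element already present in the file; a sketch of the complete core-site.xml after editing:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>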
 
 
$vi etc/hadoop/hdfs-site.xml
#add the following inside the <configuration> element:
  <property>
        <name>dfs.replication</name>
        <value>1</value>
 </property>
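
dfs.replication is set to 1 because a pseudo-distributed cluster has only a single DataNode; the default of 3 would leave every block permanently under-replicated. The effective value can be read back without starting any daemon:

$bin/hdfs getconf -confKey dfs.replication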
 
$vi etc/hadoop/mapred-site.xml
#add the following inside the <configuration> element:
 <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
 </property>
 <property>
       <name>yarn.app.mapreduce.am.env</name>
       <value>HADOOP_MAPRED_HOME=/opt/hadoop-3.2.0</value>
 </property>
 <property>
       <name>mapreduce.map.env</name>
       <value>HADOOP_MAPRED_HOME=/opt/hadoop-3.2.0</value>
 </property>
 <property>
       <name>mapreduce.reduce.env</name>
       <value>HADOOP_MAPRED_HOME=/opt/hadoop-3.2.0</value>
 </property>
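
The three HADOOP_MAPRED_HOME entries matter on Hadoop 3.x: the MapReduce framework jars are not on the YARN container classpath by default, so without them the example job later tends to fail while launching its ApplicationMaster. An alternative sometimes used instead (a sketch, not taken from this article) is to put the jars on the classpath directly:

 <property>
       <name>mapreduce.application.classpath</name>
       <value>/opt/hadoop-3.2.0/share/hadoop/mapreduce/*:/opt/hadoop-3.2.0/share/hadoop/mapreduce/lib/*</value>
 </property>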
 
$vi etc/hadoop/yarn-site.xml
#add the following inside the <configuration> element:
 <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
 </property>
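
mapreduce_shuffle registers the shuffle handler as a NodeManager auxiliary service, which is what lets reducers fetch map output when MapReduce runs on YARN. Before starting anything, it can be worth checking that the four edited files are still well-formed XML (assuming xmllint from libxml2 is available, as it usually is on CentOS 7):

$xmllint --noout etc/hadoop/core-site.xml etc/hadoop/hdfs-site.xml etc/hadoop/mapred-site.xml etc/hadoop/yarn-site.xml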

 

3. Set Up Passwordless SSH Login

 

 $ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
 $cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
 $chmod 0600 ~/.ssh/authorized_keys
 #sshd rejects key-based login if the home directory is group- or world-writable
 $chmod 755 ~
    Note: once this is set up, make sure that ssh localhost logs in without asking for a password before moving on; see https://zhoupinheng.iteye.com/admin/blogs/2436265 for reference.
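
    A non-interactive version of that test: with BatchMode enabled ssh fails instead of prompting, so the command below prints ok only if key-based login really works (run a plain ssh localhost once first so the host key is already in known_hosts):

 $ssh -o BatchMode=yes localhost 'echo ok'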

4. Start and Test

    

 $bin/hdfs namenode -format
 $sbin/start-dfs.sh
 
 $sbin/start-yarn.sh
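
 Once both start scripts have finished, jps (shipped with the JDK) should list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager; the web UIs are another quick health check:

 $/opt/jdk1.8/bin/jps
 #NameNode web UI:        http://localhost:9870
 #ResourceManager web UI: http://localhost:8088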
   
 $bin/hdfs dfs -mkdir /user
 $bin/hdfs dfs -mkdir /user/devops
 
 $bin/hdfs dfs -put etc/hadoop /user/devops/input
 
 #remove the shellprofile.d subdirectory from the input: the wordcount example does not recurse into subdirectories
 $bin/hdfs dfs -rm -f -r /user/devops/input/shellprofile.d
 #remove any output directory left over from a previous run
 $bin/hdfs dfs -rm -f -r /user/devops/output
 #run the wordcount example
 $bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.0.jar wordcount input output
 #view the results
 $bin/hdfs dfs -cat output/*
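
 Since input and output are relative HDFS paths, they resolve to /user/devops/input and /user/devops/output. Listing the output directory is another way to confirm the job finished; a successful run leaves an empty _SUCCESS marker next to the part-r-* files:

 $bin/hdfs dfs -ls output
 #optionally copy the results to the local filesystem
 $bin/hdfs dfs -get output ./wordcount-output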
  
 #shut the cluster down when finished
 $sbin/stop-yarn.sh
 $sbin/stop-dfs.sh
 

 
