
CentOS: configuring passwordless SSH login


While configuring a Hadoop cluster for the first time, I ran into a problem where SSH would not log in without a password.

(If you don't want the details, jump to the summary at the end: neither the authorized_keys file nor the .ssh directory may be writable by anyone other than the owner.)

 

1. Three servers, all running CentOS: TT1, TT2, TT3

A hadoop account was created on each server:

/usr/sbin/useradd hadoop   # add the user
passwd hadoop              # set its password
Then go into /home/hadoop/.ssh/ and generate a key pair:
ssh-keygen -t rsa
cat id_rsa.pub >> authorized_keys
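A minimal sketch of step 1 on a single node, assuming the hadoop account already exists; the explicit directory and file modes are my addition (they are what the rest of this post shows to be essential), not part of the original commands:

# run as the hadoop user on each node
mkdir -p ~/.ssh && chmod 700 ~/.ssh          # .ssh must not be writable by group/other
cd ~/.ssh
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa     # empty passphrase, so logins need no prompt
cat id_rsa.pub >> authorized_keys
chmod 600 authorized_keys                    # authorized_keys must not be writable by group/other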
 
2. Copy authorized_keys from TT1 to TT3 as authorized_keys_from_tt1:
scp authorized_keys hadoop@TT3:/home/hadoop/.ssh/authorized_keys_from_tt1

then run on TT3:

cat authorized_keys_from_tt1 >> authorized_keys
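As a hedged alternative to the manual scp + cat above: the ssh-copy-id helper that ships with the OpenSSH client appends the key for you and creates the remote ~/.ssh if it is missing (not what the original steps used, but it saves the renaming dance):

# on TT1, as hadoop; you will be asked for TT3's password once
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@TT3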

 

3. On TT1, running

ssh hadoop@TT3  asks for a password every time:

[hadoop@TT1 .ssh]$ ssh hadoop@TT3
hadoop@TT3's password: 
Last login: Wed May 28 17:11:29 2014 from TT3
[hadoop@TT3 ~]$ exit
logout
 
Connection to TT3 closed.
[hadoop@TT1 .ssh]$ 
[hadoop@TT1 .ssh]$ 
[hadoop@TT1 .ssh]$ ssh hadoop@TT3
The authenticity of host 'TT3 (192.168.174.223)' can't be established.
RSA key fingerprint is c3:c9:dd:59:35:e2:7c:27:18:ab:ca:6b:6f:79:a3:84.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'TT3,192.168.174.223' (RSA) to the list of known hosts.
hadoop@TT3's password: 
Last login: Wed May 28 17:24:39 2014 from 192.168.174.120
[hadoop@TT3 ~]$ exit
logout
 
But when logging in to TT2, the password was only needed the first time; subsequent logins needed none.

Comparing the two, TT2 had a DSA key while TT3 did not, so I guessed the missing DSA key on TT3 might be the reason.

 

4. On TT3, run

ssh-keygen -t dsa
cat id_dsa.pub >>authorized_keys
[hadoop@TT3 .ssh]$ ssh-keygen -t dsa
Generating public/private dsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_dsa): 
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/hadoop/.ssh/id_dsa.
Your public key has been saved in /home/hadoop/.ssh/id_dsa.pub.
The key fingerprint is:
3a:20:9b:91:97:93:41:d8:50:76:e5:30:0a:3d:dd:0c hadoop@TT3
[hadoop@TT3 .ssh]$ 
[hadoop@TT3 .ssh]$ cat id_dsa.pub >> authorized_keys
[hadoop@TT3 .ssh]$ 
  

5. On TT1, run

[hadoop@TT1 .ssh]$ ssh hadoop@TT3
Last login: Wed May 28 17:25:08 2014 from 192.168.174.221
[hadoop@TT3 ~]$ 
A password was needed only that first time; after that, none — configuration successful.

 

Update: I looked into the difference between the two key types; quoting someone else's description may be clearer. References:

http://blog.sina.com.cn/s/blog_6f31085901015agu.html

http://www.51know.info/system_base/openssl.html

 

Looking at the standards themselves would make it clearer still:

DSA = Digital Signature Algorithm. Based on discrete logarithm computation.
DES = Data Encryption Standard. An obsolete standard.

 

---------------------------------------------------the wrong-guess divider---------------------------------------------------------

 

Last time I thought DSA was the cause, but after checking the difference between DSA and RSA I realized it shouldn't be. So I deleted all RSA and DSA keys on the three servers and started over; I had also seen others mention permissions as the culprit, so this time I tested that specifically.

Sticking with RSA throughout, since it is the more widely supported of the two.

Steps:

1. As before, generate new RSA keys on the three machines;

 

2. On the second machine, after adding the first machine's RSA public key, set the permissions:

[hadoop@Test-02 .ssh]$ ll
total 16
-rw-rw-r-- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
[hadoop@TeamTest-02 .ssh]$ chmod 600 authorized_keys
[hadoop@TeamTest-02 .ssh]$ ll
total 16
-rw------- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
[hadoop@TeamTest-02 .ssh]$ 

SSH login from the first machine now succeeds in one go.

 

3. On the third machine, the first machine's RSA key was added but the permissions were left untouched; SSH from the first machine then asks for a password every time:

[hadoop@Test-01 .ssh]$ ssh hadoop@192.168.30.223
hadoop@192.168.30.223's password: 
Last login: Tue Jun  3 15:23:08 2014 from 192.168.30.222
[hadoop@Test-03 ~]$ exit
logout

Then set the permissions the same way as on the second machine:

[hadoop@Test-03 .ssh]$ ll
total 16
-rw-rw-r-- 1 hadoop hadoop  800 Jun  3 15:22 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:19 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:19 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:19 id_rsa.pub
[hadoop@Test-03 .ssh]$ 
[hadoop@Test-03 .ssh]$ 
[hadoop@Test-03 .ssh]$ chmod 600 authorized_keys
[hadoop@Test-03 .ssh]$ ll
total 16
-rw------- 1 hadoop hadoop  800 Jun  3 15:22 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:19 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:19 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:19 id_rsa.pub
[hadoop@Test-03 .ssh]$ 

And there it is: logging in directly from the first machine no longer requires a password (the password had already been entered once on the earlier attempt, so there is no first-time prompt either):

[hadoop@Test-01 .ssh]$ ssh hadoop@192.168.30.223
Last login: Tue Jun  3 15:24:14 2014 from 192.168.30.221
[hadoop@Test-03 ~]$ exit
logout

This also confirms my earlier point: either key type works.

 

You can also test the connection with ssh -v hadoop@192.168.30.223 to get verbose output:

[hadoop@Test-01 ~]$ ssh -v localhost
OpenSSH_4.3p2, OpenSSL 0.9.8e-fips-rhel5 01 Jul 2008
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Applying options for *
debug1: Connecting to localhost [127.0.0.1] port 22.
debug1: Connection established.
debug1: identity file /home/hadoop/.ssh/identity type -1
debug1: identity file /home/hadoop/.ssh/id_rsa type 1
debug1: identity file /home/hadoop/.ssh/id_dsa type -1
debug1: loaded 3 keys
debug1: Remote protocol version 2.0, remote software version OpenSSH_4.3
debug1: match: OpenSSH_4.3 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_4.3
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: server->client aes128-ctr hmac-md5 none
debug1: kex: client->server aes128-ctr hmac-md5 none
debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
debug1: Host 'localhost' is known and matches the RSA host key.
debug1: Found key in /home/hadoop/.ssh/known_hosts:1
debug1: ssh_rsa_verify: signature correct
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug1: SSH2_MSG_NEWKEYS received
debug1: SSH2_MSG_SERVICE_REQUEST sent
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug1: Authentications that can continue: publickey,gssapi-with-mic,password
debug1: Next authentication method: gssapi-with-mic
debug1: Unspecified GSS failure.  Minor code may provide more information
No credentials cache found

debug1: Unspecified GSS failure.  Minor code may provide more information
No credentials cache found

debug1: Unspecified GSS failure.  Minor code may provide more information
No credentials cache found

debug1: Next authentication method: publickey
debug1: Trying private key: /home/hadoop/.ssh/identity
debug1: Offering public key: /home/hadoop/.ssh/id_rsa
debug1: Authentications that can continue: publickey,gssapi-with-mic,password
debug1: Trying private key: /home/hadoop/.ssh/id_dsa
debug1: Next authentication method: password
hadoop@localhost's password: 
debug1: Authentication succeeded (password).
debug1: channel 0: new [client-session]
debug1: Entering interactive session.
debug1: Sending environment.
debug1: Sending env LANG = en_US.UTF-8
Last login: Tue Jun  3 15:20:33 2014 from teamtest-01

[hadoop@TeamTest-01 ~]$ exit
logout

debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
debug1: channel 0: free: client-session, nchannels 1
Connection to localhost closed.
debug1: Transferred: stdin 0, stdout 0, stderr 33 bytes in 2.4 seconds
debug1: Bytes per second: stdin 0.0, stdout 0.0, stderr 14.0
debug1: Exit status 0
[hadoop@Test-01 ~]$ ll

I'll look into how to debug SSH in more depth later.

 

Summary: passwordless SSH remote login (a consolidated command sketch follows this list)

1. Generate a key pair with ssh-keygen -t rsa

2. Copy the client's public key id_rsa.pub to the .ssh directory under the home directory of the account you want to log in to remotely; create the .ssh directory if it does not exist (e.g. /home/hadoop/.ssh)

3. Append the client's public key to the server-side authorized_keys

4. Set permissions: chmod 600 authorized_keys

5. From the client, try a passwordless login: ssh hadoop@192.168.30.222

6. If it fails, test with ssh -v IP and look up the errors online
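Putting the summary together, a minimal end-to-end sketch; the hostnames, account and IP below are the examples used in this post, so adjust them to your own environment:

# on the client, as the hadoop user
ssh-keygen -t rsa                                      # step 1: generate the key pair
scp ~/.ssh/id_rsa.pub hadoop@192.168.30.222:/tmp/      # step 2: copy the public key over

# on the server, as the hadoop user
mkdir -p ~/.ssh                                        # create .ssh if it does not exist
cat /tmp/id_rsa.pub >> ~/.ssh/authorized_keys          # step 3: append the client's key
chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys   # step 4: fix the permissions

# back on the client
ssh hadoop@192.168.30.222                              # step 5: should log in without a password
ssh -v hadoop@192.168.30.222                           # step 6: verbose output if it still fails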

---------------------------------------------------the rwx divider---------------------------------------------------------

Being picky about the 600 permission — 400 might turn up as well — so under exactly which modes is a password required? Keep testing:

 

[hadoop@Test-02 .ssh]$ ll
total 20
-rwx------ 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ chmod 740  authorized_keys 
[hadoop@Test-02 .ssh]$ ll
total 20
-rwxr----- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ chmod 760  authorized_keys 
[hadoop@Test-02 .ssh]$ ll
total 20
-rwxrw---- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ chmod 704  authorized_keys  
[hadoop@Test-02 .ssh]$ ll
total 20
-rwx---r-- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ chmod 706  authorized_keys 
[hadoop@Test-02 .ssh]$ ll
total 20
-rwx---rw- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ ll
total 20
-rwxr-xr-x 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ 
It turns out that a password is required whenever group (g) or other (o) has the w bit; all other modes — 000, ..., 700, ..., 755 — log in without a password. Why? Time to dig...
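A quick way to see whether a file would trip this, and to strip the offending bits (a sketch using GNU coreutils stat):

stat -c '%a %U %n' ~/.ssh/authorized_keys    # print octal mode, owner and name
chmod go-w ~/.ssh/authorized_keys            # drop any write bit for group and other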

---------------------------------------------------the dig divider---------------------------------------------------------

 

[hadoop@Test-01 ~]$ ssh -v 192.168.30.222
The full output is too long to paste, so here is just the difference:
debug1: Next authentication method: publickey
debug1: Trying private key: /home/hadoop/.ssh/identity
debug1: Offering public key: /home/hadoop/.ssh/id_rsa
debug1: Server accepts key: pkalg ssh-rsa blen 277
debug1: read PEM private key done: type RSA
debug1: Authentication succeeded (publickey).
debug1: channel 0: new [client-session]
debug1: Entering interactive session.
debug1: Sending environment.
debug1: Sending env LANG = en_US.UTF-8
---------------------------------------------------------------------------------------
debug1: Next authentication method: publickey
debug1: Trying private key: /home/hadoop/.ssh/identity
debug1: Offering public key: /home/hadoop/.ssh/id_rsa
debug1: Authentications that can continue: publickey,gssapi-with-mic,password
debug1: Trying private key: /home/hadoop/.ssh/id_dsa
debug1: Next authentication method: password
hadoop@192.168.30.222's password: 
debug1: Authentication succeeded (password).
debug1: channel 0: new [client-session]
debug1: Entering interactive session.
debug1: Sending environment.
debug1: Sending env LANG = en_US.UTF-8
Compare: one authenticates straight away with the public key, the other falls back to asking for a password — more digging needed...
debug1: Authentication succeeded (publickey).
debug1: Next authentication method: password
Since it is a permissions problem, the server log should say so:
[root@Test-02 ~]# tail -f /var/log/secure
Jun  6 18:33:10 Test-02 sshd[646]: pam_unix(sshd:session): session closed for user hadoop
Jun  6 18:33:11 Test-02 sshd[731]: Authentication refused: bad ownership or modes for file /home/hadoop/.ssh/authorized_keys
There it is:
Jun  6 18:27:46 Test-02 sshd[731]: Authentication refused: bad ownership or modes for file /home/hadoop/.ssh/authorized_keys
So how is this explained? Other people's writeups only give the fix, not the reason. Fine, where is the official documentation? Hunting around the website is tiresome, so just man ssh, which turns up this passage:
~/.ssh/authorized_keys
             Lists the public keys (RSA/DSA) that can be used for logging in as this user.  The format of this file is described in the sshd(8) manual
             page.  This file is not highly sensitive, but the recommended permissions are read/write for the user, and not accessible by others.
"not accessible by others" is still not the full answer and is not very precise, so on to the sshd(8) manual...
A description picked up along the way:
SSH doesn’t like it if your home or ~/.ssh directories have group write permissions. 
Your home directory should be writable only by you, ~/.ssh should be 700, and authorized_keys should be 600

You can also get around this by adding StrictModes off to your ssh_config file, 
but I’d advise against it - fixing permissions is the way to go.
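For reference, StrictModes is actually a server-side option documented in sshd_config(5) and takes yes/no, so disabling it would look like the snippet below; as the quote says, though, fixing the permissions is the better approach:

# /etc/ssh/sshd_config  (requires an sshd restart; not recommended)
StrictModes no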
Reading the code is still the clearest, so let's get the source.
The following is at lines 383–387 of auth.c — could this be the reason?
	if ((stp->st_uid != 0 && stp->st_uid != uid) ||
	    (stp->st_mode & 022) != 0) {
		snprintf(err, errlen, "bad ownership or modes for file %s",
		    buf);
		return -1;
	}
So what exactly does this code mean?
---------------------------------------------------the ssh code divider---------------------------------------------------------
stp->st_uid != 0 && stp->st_uid != uid
     the file's owner is neither root nor the user who is logging in
(stp->st_mode & 022) != 0
     the mode is ANDed with 022 — an octal constant, not decimal — which in binary is 000 010 010, i.e. exactly the write bit for group and the write bit for other
Overall: if the file is owned by someone other than root or the user, or if group or other has write permission on it, sshd rejects the key and logs the "bad ownership or modes" error shown above.
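The same AND test can be reproduced in the shell, since bash arithmetic treats a leading 0 as octal just like C does (a quick illustration, not from the original post):

printf '%o\n' $(( 0644 & 022 ))   # 0  -> key accepted
printf '%o\n' $(( 0664 & 022 ))   # 20 -> refused, group has w
printf '%o\n' $(( 0646 & 022 ))   # 2  -> refused, other has w
printf '%o\n' $(( 0755 & 022 ))   # 0  -> key accepted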
---------------------------------------------------the Linux rwx divider---------------------------------------------------------
The SSH code explanation is a bit dizzying; describing it in rwx terms is easier, if you are comfortable with Linux permissions.
Linux permission values: r=4, w=2, x=1
Read that way, the code above needs no conversion to binary at all:
(stp->st_mode & 022) != 0
The permissions on the system:
[hadoop@Test-02 .ssh]$ ll
total 20
-rwxr--r-- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$ 
A mode of 022 corresponds to [-----w--w-], i.e. write permission for group and other only:
[hadoop@Test-02 .ssh]$ chmod 022 authorized_keys
[hadoop@Test-02 .ssh]$ ll
total 20
-----w--w- 1 hadoop hadoop  800 Jun  3 15:18 authorized_keys
-rw-rw-r-- 1 hadoop hadoop  400 Jun  3 15:16 authorized_keys_221
-rw------- 1 hadoop hadoop 1675 Jun  3 15:15 id_rsa
-rw-r--r-- 1 hadoop hadoop  400 Jun  3 15:15 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  396 Jun  3 15:20 known_hosts
[hadoop@Test-02 .ssh]$
Now the file's mode ANDed with 022 is non-zero, so sshd reports the error.
In other words, this file must not give group or other the write permission; every other combination is fine — exactly the result of the tests back at the rwx divider.
The AND is against [-----w--w-]: if group or other has w anywhere in the mode, the result cannot be 0, and the login is refused.
One last check: the same test appears in two places in the code, so in fact both the authorized_keys file and the .ssh directory must be free of group/other write permission.
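So the practical fix, covering both places the check applies — and the home directory too, which the quoted note above also mentions (include it only if sshd still complains after fixing the first two):

chmod go-w ~ ~/.ssh ~/.ssh/authorized_keys   # no group/other write on home, .ssh or the key file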
---------------------------------------------------the over divider---------------------------------------------------------
I have turned into the king of rambling...