Python 3.6.5: connecting to Hive and HDFS with Kerberos authentication
1. Kerberos is a computer-network authentication protocol that provides secure identity verification for communicating parties over an insecure network. See the official documentation for details.
2. Packages to install (on CentOS)

yum install libsasl2-dev
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbv
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
3. Configure /etc/krb5.conf: set the realm your servers belong to in this file.
4. Configure /etc/hosts: map the cluster machines and the machines hosting the realm.
5. Generate a ccache_file or keytab_file with kinit.
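As a sketch of this step, assuming the MIT Kerberos client tools (kinit, klist) are installed, the ticket can be obtained from a keytab via subprocess; the keytab path and principal below are placeholders, not values from a real cluster:

```python
import shutil
import subprocess

def build_kinit_command(keytab_path, principal):
    """Build the kinit command that obtains a ticket cache from a keytab."""
    return ['kinit', '-kt', keytab_path, principal]

cmd = build_kinit_command('/etc/security/xxx.keytab', 'xxx@PANEL.COM')
print(' '.join(cmd))

# Only invoke kinit when the Kerberos client tools are actually installed.
if shutil.which('kinit'):
    subprocess.run(cmd, check=True)
    subprocess.run(['klist'], check=True)  # show the cached ticket
```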
6. Hive connection code:
import os
from impala.dbapi import connect
from krbcontext import krbcontext

keytab_path = os.path.split(os.path.realpath(__file__))[0] + '/xxx.keytab'
principal = 'xxx'
with krbcontext(using_keytab=True, principal=principal, keytab_file=keytab_path):
    # ip is the HiveServer2 host; 10000 is the default HiveServer2 port
    conn = connect(host=ip, port=10000, auth_mechanism='GSSAPI', kerberos_service_name='hive')
    cursor = conn.cursor()
    cursor.execute('select * from default.books')
    for row in cursor:
        print(row)
7. HDFS connection code:
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

hdfs_url = 'http://' + host + ':' + port
data = self._get_keytab(sso_ticket)
self._save_keytab(data)
with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
    self.client = KerberosClient(hdfs_url)
self.client._list_status(path).json()['FileStatuses']['FileStatus']  # list the files and directories under path
8. Note: the krbcontext package officially claims to support only Python 2, but it works on Python 3 as well.
The hdfs_url must include the "http://" prefix, otherwise an error is raised.
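To guard against that error, a small helper (a hypothetical name, not part of the hdfs package) can normalize the URL before the client is constructed; 50070 is the default WebHDFS port and is used here only as an illustration:

```python
def normalize_hdfs_url(host, port):
    """Return a WebHDFS URL with an explicit http:// scheme.

    The Kerberos HDFS client fails when the scheme is missing, so add
    it when the caller passed a bare hostname.
    """
    url = '{0}:{1}'.format(host, port)
    if not url.startswith(('http://', 'https://')):
        url = 'http://' + url
    return url

print(normalize_hdfs_url('name-1', 50070))       # → http://name-1:50070
print(normalize_hdfs_url('http://name-1', 50070))  # already prefixed, left alone
```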
9. I have added some extra configuration; the details are as follows.
Python 3.6.5: HDFS and Hive connections with Kerberos authentication (including base environment setup)
1. Required environment
yum packages (install the yum packages before the Python packages, otherwise you will run into problems)
yum install openldap-clients -y
yum install krb5-workstation krb5-libs -y
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbv
yum install cyrus-sasl-plain cyrus-sasl-devel cyrus-sasl-gssapi
Python package installation (use pip or pip3 depending on your setup)
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
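After installing, a quick sanity check (a throwaway sketch, not from the original post) can confirm that each pip package actually imports; note that pykerberos installs a module named kerberos:

```python
import importlib

def missing_modules(module_names):
    """Return the subset of module_names that cannot be imported."""
    missing = []
    for name in module_names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

# Modules provided by the pip packages listed above.
required = ['krbcontext', 'thrift', 'thrift_sasl', 'impala.dbapi',
            'hdfs.ext.kerberos', 'kerberos']
print('missing:', missing_modules(required))
```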
Configure /etc/hosts (map the big-data cluster machines and their domain names):
10.xxx.xxx.xxx name-1 panel.test.com
10.xxx.xxx.xxx name-1
Configure /etc/krb5.conf (check your Kerberos service configuration for the exact values).
Reference configuration (for reference only; adapt it to your actual setup):
[libdefaults]
    renew_lifetime = 9d
    forwardable = true
    default_realm = panel.com
    ticket_lifetime = 24h
    dns_lookup_realm = false
    dns_lookup_kdc = false
    default_ccache_name = /tmp/krb5cc_%{uid}

[logging]
    default = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind1.log
    kdc = FILE:/var/log/krb5kdc1.log

[realms]
    panel.com = {
        admin_server = panel.test1.com
        kdc = panel.test1.com
    }
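krb5.conf uses its own profile syntax (with braces inside [realms]), so configparser cannot parse it reliably. A minimal presence check on the section headers, assuming the standard /etc/krb5.conf path, might look like this sketch:

```python
import os
import re

def krb5_sections(text):
    """Return the set of section names ([libdefaults], [realms], ...) in a krb5.conf body."""
    return set(re.findall(r'^\s*\[(\w+)\]', text, flags=re.MULTILINE))

def check_krb5_conf(path='/etc/krb5.conf'):
    """Report whether the sections this tutorial relies on are present."""
    if not os.path.exists(path):
        return 'missing file: ' + path
    with open(path) as f:
        sections = krb5_sections(f.read())
    lacking = {'libdefaults', 'realms'} - sections
    return 'ok' if not lacking else 'missing sections: ' + ', '.join(sorted(lacking))

print(check_krb5_conf())
```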
Connection code:
HDFS:
import json, os
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

def _connect(self, host, port, sso_ticket=None):
    try:
        hdfs_url = 'http://' + host + ':' + port
        # Re-activate the Kerberos authentication for the current user:
        # because of Python's caching, the credential cache does not switch
        # automatically when the user changes, so run kinit manually.
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        os.system(active_str)
        with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
            self.client = KerberosClient(hdfs_url)
    except Exception as e:
        raise e
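The JSON returned by _list_status follows the WebHDFS FileStatuses schema, so a small helper (hypothetical, for illustration only) can reduce it to name/type pairs without a live cluster:

```python
def list_entries(liststatus_json):
    """Extract (name, type) pairs from a WebHDFS LISTSTATUS response."""
    statuses = liststatus_json['FileStatuses']['FileStatus']
    return [(s['pathSuffix'], s['type']) for s in statuses]

# Example payload in the shape WebHDFS returns for LISTSTATUS.
sample = {
    'FileStatuses': {
        'FileStatus': [
            {'pathSuffix': 'books', 'type': 'DIRECTORY'},
            {'pathSuffix': 'data.csv', 'type': 'FILE'},
        ]
    }
}
print(list_entries(sample))  # → [('books', 'DIRECTORY'), ('data.csv', 'FILE')]
```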
Hive:
import os
from krbcontext import krbcontext
from impala.dbapi import connect
from auto_model_platform.settings import config

def _connect(self, host, port, sso_ticket=None):
    try:
        # Same manual credential-cache activation as for HDFS.
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        os.system(active_str)
        with krbcontext(using_keytab=True, principal=self.user, keytab_file=self.keytab_file):
            self.conn = connect(host=host, port=port, auth_mechanism='GSSAPI', kerberos_service_name='hive')
            self.cursor = self.conn.cursor()
    except Exception as e:
        raise e
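Once connected, impyla's cursor follows DB-API 2.0 and exposes column names through cursor.description, so rows can be turned into dicts. The sketch below demonstrates this against a stub cursor rather than a live HiveServer2:

```python
def rows_as_dicts(cursor):
    """Map each fetched row to a {column_name: value} dict via cursor.description."""
    columns = [desc[0] for desc in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]

class FakeCursor:
    """Stand-in for an impyla cursor, for demonstration only."""
    description = [('title', 'STRING'), ('author', 'STRING')]

    def fetchall(self):
        return [('Dune', 'Herbert')]

print(rows_as_dicts(FakeCursor()))  # → [{'title': 'Dune', 'author': 'Herbert'}]
```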
Summary
I hit quite a few pitfalls while doing this. It really helps to understand the underlying principles, such as how Kerberos works and its corresponding commands.
If you are building a base platform that has to switch between multiple users, I would advise against Python: it is not friendly at all and the official packages have many problems. I ended up switching to Java JDBC to operate HDFS and Hive.
If you are only testing on your own or doing algorithm research, it is still usable, because the code is simple and easy to get working.
Addendum
kinit commands:
kinit -kt xxxx.keytab xxxx    # activate the credential cache for user xxxx from the keytab
klist                         # show the currently cached credentials