[hive] beeline error: User: root is not allowed to impersonate root (state=,code=0)

...
Error: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root (state=,code=0)

Add the configuration below (note: NOT to hive-site.xml !!!) to

hadoop/etc/hadoop/core-site.xml

and then restart HDFS. Here "xxx" is the user that connects through beeline; replace "xxx" with your own username, which in this case is root.

Do not copy this block as-is; the placeholder still needs to be changed:
<property>
    <name>hadoop.proxyuser.xxx.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.xxx.groups</name>
    <value>*</value>
</property>

That is, with root substituted in:

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>
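
If restarting the whole of HDFS is inconvenient, the proxy-user mappings can normally be reloaded at runtime with the Hadoop admin refresh commands below (a small sketch; it assumes you run them as the HDFS/YARN admin user on the NameNode and ResourceManager hosts):

# Reload the superuser proxy (hadoop.proxyuser.*) mappings on the NameNode
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
# Reload the same mappings on the YARN ResourceManager
yarn rmadmin -refreshSuperUserGroupsConfiguration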

With the problem solved, here is a summary of the material found online:

The main cause is that Hadoop has a security impersonation (proxy user) mechanism: Hadoop does not let an upper-layer system pass the actual end user straight down to the Hadoop layer. Instead, the actual user is handed to a designated super-user proxy, and it is this proxy that performs the operations on Hadoop, which keeps arbitrary clients from operating on Hadoop at will.
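
For context, whether HiveServer2 impersonates the connecting user at all is governed by hive.server2.enable.doAs in hive-site.xml (it defaults to true). The snippet below is only illustrative of that switch: leaving it at true is what makes the core-site.xml proxy-user entries above necessary, while setting it to false would run queries as the user that started HiveServer2 instead.

<property>
    <name>hive.server2.enable.doAs</name>
    <value>true</value>
</property>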

The HiveServer2 web UI listens on port 10002.
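
A quick way to check that the UI is reachable (assuming the hadoop3 host used in the session below):

curl -s http://hadoop3:10002/ | head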

A successful session using Hive through beeline:

[root@hadoop3 ~]# beeline
Beeline version 1.6.3 by Apache Hive
beeline> !connect jdbc:hive2://hadoop3:10000
Connecting to jdbc:hive2://hadoop3:10000
Enter username for jdbc:hive2://hadoop3:10000: root
Enter password for jdbc:hive2://hadoop3:10000: ******
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/programs/spark-1.6.3-bin-hadoop2.6/lib/spark-assembly-1.6.3-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/programs/hadoop-2.8.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/12/10 04:39:51 INFO jdbc.Utils: Supplied authorities: hadoop3:10000
19/12/10 04:39:51 INFO jdbc.Utils: Resolved authority: hadoop3:10000
19/12/10 04:39:51 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://hadoop3:10000
Connected to: Apache Hive (version 2.1.0)
Driver: Spark Project Core (version 1.6.3)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://hadoop3:10000> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| test01         |
+----------------+--+
2 rows selected (1.823 seconds)
0: jdbc:hive2://hadoop3:10000> use default;
No rows affected (0.087 seconds)
0: jdbc:hive2://hadoop3:10000> select * from links limit 50;
+----------------+---------------+---------------+--+
| links.movieid  | links.imdbid  | links.tmdbid  |
+----------------+---------------+---------------+--+
+----------------+---------------+---------------+--+
No rows selected (1.427 seconds)
0: jdbc:hive2://hadoop3:10000> 
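
For scripted use, the same connection can also be opened non-interactively; a minimal sketch, assuming the same hadoop3:10000 endpoint and root account as above (the password here is a placeholder):

# -u: JDBC URL, -n: username, -p: password, -e: statement to run
beeline -u jdbc:hive2://hadoop3:10000 -n root -p '******' -e 'show databases;'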
