Several permission problems after enabling the Sentry service  Blog category: cloudera sentry hive ldap
程序员文章站
2024-03-18 18:50:16
The account bi is used as the example throughout.
Issue 1: After the bi account logs in through Beeline with LDAP, its external tables need to reference data under the /user/bi directory.
Solution:
According to the Sentry documentation, the role must be granted ALL on the /user/bi URI:
GRANT ALL ON URI 'hdfs://172.20.0.71:8020/user/bi' TO ROLE user_bi_all_role;
That solved it.
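Putting the grant in context, a minimal end-to-end sketch might look like the following. The HiveServer2 URL, credentials, and the example table are illustrative assumptions, not taken from the original setup:

```shell
# As a Sentry admin: create the role, grant the URI, bind it to the bi group.
beeline -u "jdbc:hive2://172.20.0.71:10000" -n admin -p 'secret' -e "
CREATE ROLE user_bi_all_role;
GRANT ALL ON URI 'hdfs://172.20.0.71:8020/user/bi' TO ROLE user_bi_all_role;
GRANT ROLE user_bi_all_role TO GROUP bi;"

# As bi: the URI grant now allows an external table over /user/bi.
beeline -u "jdbc:hive2://172.20.0.71:10000" -n bi -p 'secret' -e "
CREATE EXTERNAL TABLE ext_demo (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'hdfs://172.20.0.71:8020/user/bi/demo';"
```

Note that Sentry resolves privileges through the user's group, which is why the role is granted to GROUP bi rather than to a user directly.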
Issue 2: The bi account runs MapReduce jobs that need to read data under /user/hive/warehouse.
Solution:
/user/hive/warehouse is normally owned by hive:hive and, per Sentry's requirements, has 771 permissions.
To give the bi account access to that directory, use an HDFS ACL:
hadoop fs -setfacl -R -m user:bi:r-x /user/hive/warehouse
This gives bi read access to the directory, which solved it.
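The ACL change can be verified with getfacl. The default-ACL line below is an extra suggestion (not in the original) so that newly created table directories inherit the entry:

```shell
# Recursive read/execute for bi on the warehouse tree.
hadoop fs -setfacl -R -m user:bi:r-x /user/hive/warehouse

# Optional: default ACL so future subdirectories inherit bi's access.
hadoop fs -setfacl -R -m default:user:bi:r-x /user/hive/warehouse

# Verify: the listing should contain a user:bi:r-x entry.
hadoop fs -getfacl /user/hive/warehouse
```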
Issue 3: After the bi account logs in through Beeline with LDAP, it needs to load data into Hive tables from /user/bi. In other words, the bi account must move data from the bi:bi-owned /user/bi directory into directories owned by hive:hive.
HiveServer2 exception:
Failed with exception Unable to move sourcehdfs://myhost:8020/user/bi/aa.txt to destination hdfs://myhost:8020/user/hive/warehouse/bi_system.db/qiu/aa.txt
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move sourcehdfs://myhost:8020/user/bi/aa.txt to destination hdfs://myhost:8020/user/hive/warehouse/bi_system.db/qiu/aa.txt
    at org.apache.hadoop.hive.ql.metadata.Hive.renameFile(Hive.java:2269)
    at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:2405)
    at org.apache.hadoop.hive.ql.metadata.Table.replaceFiles(Table.java:673)
    at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1490)
    at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:275)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1516)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1283)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1101)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:924)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:919)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:145)
    at org.apache.hive.service.cli.operation.SQLOperation.access$000(SQLOperation.java:69)
    at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:200)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:502)
    at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/user/bi":bi:bi:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:255)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:236)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:214)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:148)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6250)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInternal(FSNamesystem.java:3608)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameToInt(FSNamesystem.java:3578)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3542)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename(NameNodeRpcServer.java:727)
Solution: the hive user needs read and write access to /user/bi; again, use an HDFS ACL:
hadoop fs -setfacl -R -m user:hive:rwx /user/bi
That solved it.
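A sketch of the fix plus a quick check that the load now works. The host, file, and table names follow the stack trace above and are examples only:

```shell
# Give hive recursive rwx on /user/bi so HiveServer2 can move files out.
hadoop fs -setfacl -R -m user:hive:rwx /user/bi
hadoop fs -getfacl /user/bi   # should now show a user:hive:rwx entry

# As bi, the load that previously failed should succeed.
beeline -u "jdbc:hive2://myhost:10000" -n bi -p 'secret' -e "
LOAD DATA INPATH 'hdfs://myhost:8020/user/bi/aa.txt'
OVERWRITE INTO TABLE bi_system.qiu;"
```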
I asked about this on the Cloudera community forum, but apparently nobody answered: https://community.cloudera.com/t5/Cloudera-Manager-Installation/with-sentry-and-ldap-hive-can-t-load-data-from-hdfs-or-local/m-p/25413
Issue 4:
For UDFs, see http://www.cloudera.com/content/cloudera/en/documentation/core/v5-2-x/topics/cm_mc_hive_udf.html
Steps:
1. Place the JAR in a designated directory on the HiveServer2 node, e.g. /tmp/udf.
2. In the Hive configuration in Cloudera Manager, set Hive Auxiliary JARs Directory to that directory.
3. Restart Hive and deploy the Hive client configuration (so that Hive loads the JAR).
4. In Beeline, as a Hive superuser, run:
create role udf_all_role;
GRANT ALL ON URI 'file:///tmp/udf/hive-udf.jar' TO ROLE udf_all_role;
GRANT ROLE udf_all_role TO GROUP bi;
5. Test:
Log in through Beeline with the bi account and run CREATE TEMPORARY FUNCTION nvl AS 'com.nexr.platform.hive.udf.GenericUDFNVL'; and it works.
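Steps 1, 4, and 5 above can be condensed into the following sketch. The host and credentials are placeholders; the JAR path and UDF class are the examples from the text:

```shell
# 1. Stage the JAR on the HiveServer2 host, in the directory configured
#    as Hive Auxiliary JARs Directory in Cloudera Manager.
mkdir -p /tmp/udf
cp hive-udf.jar /tmp/udf/

# 4. As a Hive superuser, authorize the JAR URI for the bi group.
beeline -u "jdbc:hive2://myhost:10000" -n hive -p 'secret' -e "
CREATE ROLE udf_all_role;
GRANT ALL ON URI 'file:///tmp/udf/hive-udf.jar' TO ROLE udf_all_role;
GRANT ROLE udf_all_role TO GROUP bi;"

# 5. As bi, register and exercise the function.
beeline -u "jdbc:hive2://myhost:10000" -n bi -p 'secret' -e "
CREATE TEMPORARY FUNCTION nvl AS 'com.nexr.platform.hive.udf.GenericUDFNVL';
SELECT nvl(NULL, 'fallback');"
```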
Issue 5:
When configuring the Impala service for Hue, Hue reports:
No available Impalad to send queries to.
User 'hive' is not authorized to delegate to 'hue'.
Beeswax in Hue works normally and Hive database-level permission control works, but as soon as the Impala service is selected in the Hue configuration, the errors above appear.
Solution:
[desktop]
ldap_username=hue
ldap_password=111111
Configure a dedicated hue account for Hue's own bind; do not reuse hive or another admin account.
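For background: Impala only lets whitelisted proxy users run queries on behalf of others, which is why binding Hue as hive triggers the delegation error. A sketch of how the pieces line up; the flag value shown is Cloudera Manager's usual default, and the hue.ini path may differ in your cluster:

```shell
# impalad is typically started with a proxy-user whitelist such as:
#   --authorized_proxy_user_config=hue=*
# i.e. only the 'hue' user may delegate to end users, so hue.ini must
# bind as 'hue'. Check the deployed configuration:
grep -A2 '^\[desktop\]' /etc/hue/conf/hue.ini
# Expected entries:
#   ldap_username=hue
#   ldap_password=...
```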