Hive impersonation does not work with a custom authenticator provider

I have developed my own authenticator provider, and everything seems to be correct with respect to authentication: HiveServer2 starts up fine, and authenticated connections are validated correctly. Even simple Hive queries work, such as show tables .
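For context, a custom provider in Hive 0.13 is the class named in hive.server2.custom.authentication.class; HiveServer2 calls its Authenticate(user, password) method for every connection and rejects the connection if it throws. A minimal sketch (class name and token check are hypothetical; the real implements clause against org.apache.hive.service.auth.PasswdAuthenticationProvider is commented out so the snippet compiles without the Hive jars):

```java
import javax.security.sasl.AuthenticationException;

public class TokenAuthenticationProviderImpl
        /* implements org.apache.hive.service.auth.PasswdAuthenticationProvider */ {

    // HiveServer2 invokes this for each incoming connection;
    // throwing AuthenticationException rejects the connection.
    public void Authenticate(String user, String token) throws AuthenticationException {
        if (!isValidToken(user, token)) {
            throw new AuthenticationException("Invalid token for user " + user);
        }
    }

    // Hypothetical validation; a real provider would call out to the
    // identity manager (the IDM endpoint configured in hive-site.xml).
    static boolean isValidToken(String user, String token) {
        return token != null && !token.isEmpty();
    }
}
```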

The problem appears when I execute a query from a remote Hive client. Since I pass my credentials on the connection (user + password... well, actually it is a token rather than a password, but that is not relevant here), and the Hive configuration is set up for impersonation (see below), I expected HiveServer2 to execute the query as my user. Instead, it runs as the hive user, which has no rights over my HDFS user space.
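For reference, the remote connection looks roughly like this (host, port, and token are placeholders; running it requires the hive-jdbc driver jar on the classpath):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class RemoteHiveClient {

    // Builds a HiveServer2 JDBC URL for the given host and port.
    static String hiveUrl(String host, int port) {
        return "jdbc:hive2://" + host + ":" + port + "/default";
    }

    public static void main(String[] args) throws Exception {
        // Requires the hive-jdbc jar at runtime.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // The "password" slot carries the token consumed by the custom provider.
        try (Connection conn = DriverManager.getConnection(
                hiveUrl("hiveserver.example.org", 10000), "frb", "my-oauth2-token")) {
            conn.createStatement().execute("show tables");
        }
    }
}
```

With hive.server2.enable.doAs=true, the expectation is that the statement above runs on HDFS as frb, not as hive.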

For example, if I create a table:

 > create external table mytable (name string, job string, age string) row format delimited fields terminated by ',' location '/user/frb/testdir'
 Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive, access=EXECUTE, inode="/user/frb":frb:frb:drwxr-----
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5519)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3517)
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:785)
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
 )

These are the relevant configuration options (simplified, shown as plain properties rather than XML):

hive-site.xml :

 com.telefonica.iot.idm.endpoint=https://account.lab.fiware.org
 hive.server2.enable.doAs=true
 hive.server2.authentication=CUSTOM
 hive.server2.custom.authentication.class=com.telefonica.iot.cosmos.hive.authprovider.OAuth2AuthenticationProviderImpl
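In full hive-site.xml form, these correspond to property elements like the following (values copied from the simplified listing above):

```xml
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>com.telefonica.iot.cosmos.hive.authprovider.OAuth2AuthenticationProviderImpl</value>
</property>
```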

core-site.xml :

 hadoop.proxyuser.hive.groups=*
 hadoop.proxyuser.hive.hosts=*
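These are the Hadoop proxyuser settings that allow the hive service user to impersonate other users; in core-site.xml they look like:

```xml
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
```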

You can look at the custom Hive authentication provider code here .

I am using Hive 0.13.0.

Can anyone help me please?

1 answer

Please check whether hive.metastore.execute.setugi is set to true on the HiveServer2 host(s) and on the Hive Metastore host(s) (steps 1 and 2 in the reference below, respectively). Note that this property must be set on the client/gateway nodes as well as on the HiveServer2 and Hive Metastore hosts.
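In hive-site.xml this is:

```xml
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
</property>
```

When set to true on both client and server, the Metastore performs its DFS operations with the client's reported user and group identity rather than the hive service user, which is what the AccessControlException above suggests is missing.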

Link: http://doc.mapr.com/display/MapR/Hive+User+Impersonation

