I have two virtual machines: the first runs Apache Ranger and LDAP, the second runs Apache Spark. I want to use Ranger policies to control users' queries in Beeline, for example.
The procedure I'm following is to run $SPARK_HOME/sbin/start-thriftserver.sh
and then create a Resource Based Policy using the Hadoop SQL option in Ranger's web interface.
The policy is created and I can connect to HiveServer2 successfully.
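For reference, this is roughly how I connect with Beeline (HTTP transport mode, matching the hive-site.xml below; the user and password are placeholders for my LDAP credentials):

beeline -u "jdbc:hive2://localhost:10001/default;transportMode=http;httpPath=cliservice" -n some_user -p some_password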
After that I run enable-hive-plugin.sh,
and the console prints the message 'Ranger Plugin for hive has been enabled. Please restart hive to ensure that changes are effective.'
But when I go back to the Ranger Web UI, the plugin does not show up under Audit > Plugin Status.
Here is my $SPARK_HOME/conf/hive-site.xml:
<configuration>
  <!-- Thriftserver -->
  <property>
    <name>hive.server2.transport.mode</name>
    <value>http</value>
  </property>
  <property>
    <name>hive.server2.thrift.http.port</name>
    <value>10001</value>
  </property>
  <property>
    <name>hive.server2.thrift.bind.host</name>
    <value>localhost</value>
  </property>
  <property>
    <name>hive.server2.webui.port</name>
    <value>10002</value>
  </property>
  <property>
    <name>hive.server2.thrift.http.path</name>
    <value>cliservice</value>
  </property>
  <!-- LDAP -->
  <property>
    <name>hive.server2.authentication</name>
    <value>LDAP</value>
  </property>
  <property>
    <name>hive.server2.authentication.ldap.url</name>
    <value>ldap://10.142.15.210:10389</value>
  </property>
  <property>
    <name>hive.server2.authentication.ldap.baseDN</name>
    <value>ou=usuarios,dc=example,dc=com</value>
  </property>
  <property>
    <name>hive.server2.authentication.ldap.Domain</name>
    <value></value>
  </property>
  <!-- Ranger -->
  <property>
    <name>hive.security.authorization.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.security.authorization.manager</name>
    <value>org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory</value>
  </property>
</configuration>
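From what I understand, enable-hive-plugin.sh should also generate a ranger-hive-security.xml next to hive-site.xml, wired up from the install.properties below. I would expect entries roughly like these (property names taken from the Ranger docs; I haven't confirmed this is exactly what the script writes):

<configuration>
  <property>
    <name>ranger.plugin.hive.service.name</name>
    <value>hivedev</value>
  </property>
  <property>
    <name>ranger.plugin.hive.policy.rest.url</name>
    <value>http://10.142.15.210:6080</value>
  </property>
  <property>
    <name>ranger.plugin.hive.policy.source.impl</name>
    <value>org.apache.ranger.admin.client.RangerAdminRESTClient</value>
  </property>
</configuration>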
And this is my ${HIVE_PLUGIN_PATH}/install.properties:
POLICY_MGR_URL=10.142.15.210:6080 # Ranger Admin IP
REPOSITORY_NAME=hivedev
COMPONENT_INSTALL_DIR_NAME=/opt/spark
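If I understand correctly, the plugin pulls its policies from Ranger's download endpoint, so a request like the one below from the Spark VM should return the hivedev policies if the service name and URL are right (the endpoint path is my assumption from the Ranger REST API):

curl "http://10.142.15.210:6080/service/plugins/policies/download/hivedev"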
I tried following the answer described here, but it didn't work for me.
Any ideas?