I am using HBase client 2.1.7 to connect to my server (same version, 2.1.7).
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>2.1.7</version>
Now, there is a user who has permission to read/write on the table on the server.
User = LTzm@yA$U
For this, my code looks like this:
String hadoop_user_key = "HADOOP_USER_NAME";
String user = "LTzm@yA$U";
System.setProperty(hadoop_user_key, user);
Now, when I try to read a key from the table, I get the following error:
error.log:! Causing: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'LTzm' (table=table_name, action=READ)
The weird part is that writes are working fine. To validate whether the right user is getting passed for the write, I removed the user, reran the code, and the write failed with this error:
error.log:! org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=LTzm@yA$U, scope=table_name, family=d:visitId, params=[table=table_name,family=d:visitId],action=WRITE)
Again, the read was also failing with:
error.log:! org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'LTzm' (table=table_name, action=READ)
Somehow 'LTzm' is getting passed with the read call and 'LTzm@yA$U' is getting passed for the write. Can anyone help me understand what the issue is here? Are '@' or other special symbols not allowed in an HBase user name (and if so, how is it working for the write calls)?
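To see what the Hadoop client library actually resolves from that value, I am thinking of a quick check along these lines (untested sketch, using only UserGroupInformation from hadoop-common; the short name is what I suspect is showing up as 'LTzm' in the read error):

import org.apache.hadoop.security.UserGroupInformation;

public class WhoAmICheck {
    public static void main(String[] args) {
        try {
            // Resolve the same string that goes into HADOOP_USER_NAME.
            UserGroupInformation ugi = UserGroupInformation.createRemoteUser("LTzm@yA$U");
            System.out.println("full name : " + ugi.getUserName());
            System.out.println("short name: " + ugi.getShortUserName());
        } catch (RuntimeException e) {
            // If the '@' makes the name unparseable as a principal, that would also be useful to know.
            System.out.println("could not resolve user: " + e);
        }
    }
}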
Edit 1: Here is the function that creates the connection:
public static Connection createConnection() throws IOException {
    String hadoop_user_key = "HADOOP_USER_NAME";
    String user = "LTzm@yA$U";

    Map<String, String> configMap = new HashMap<>();
    configMap.put("hbase.rootdir", "hdfs://session/apps/hbase/data");
    configMap.put("hbase.zookeeper.quorum", "ip1, ip2");
    configMap.put("zookeeper.znode.parent", "/hbase");
    configMap.put("hbase.rpc.timeout", "400");
    configMap.put("hbase.rpc.shortoperation.timeout", "400");
    configMap.put("hbase.client.meta.operation.timeout", "5000");
    configMap.put("hbase.rpc.engine", "org.apache.hadoop.hbase.ipc.SecureRpcEngine");
    configMap.put("hbase.client.retries.number", "3");
    configMap.put("hbase.client.operation.timeout", "3000");
    configMap.put(HConstants.HBASE_CLIENT_IPC_POOL_SIZE, "30");
    configMap.put("hbase.client.pause", "50");
    configMap.put("hbase.client.pause.cqtbe", "1000");
    configMap.put("hbase.client.max.total.tasks", "500");
    configMap.put("hbase.client.max.perserver.tasks", "50");
    configMap.put("hbase.client.max.perregion.tasks", "10");
    configMap.put("hbase.client.ipc.pool.type", "RoundRobinPool");
    configMap.put("hbase.rpc.read.timeout", "200");
    configMap.put("hbase.rpc.write.timeout", "200");
    configMap.put("hbase.client.write.buffer", "20971520");

    System.setProperty(hadoop_user_key, user);

    Configuration hConfig = HBaseConfiguration.create();
    for (String key : configMap.keySet()) {
        hConfig.set(key, configMap.get(key));
    }
    UserGroupInformation.setConfiguration(hConfig);

    Connection hbaseConnection = ConnectionFactory.createConnection(hConfig);
    return hbaseConnection;
}
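For reference, an alternative I could try is to pass the user to the connection explicitly instead of relying on the HADOOP_USER_NAME system property. This is an untested sketch based on the ConnectionFactory.createConnection(Configuration, User) overload and User.create(UserGroupInformation) from hbase-client; I have not verified whether it changes which user name reaches the server:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.User;
import org.apache.hadoop.security.UserGroupInformation;

public class ExplicitUserConnection {
    // Untested sketch: build the connection with an explicit User object
    // instead of the HADOOP_USER_NAME system property.
    public static Connection create(Configuration hConfig) throws IOException {
        // Note: createRemoteUser may still apply Kerberos-style parsing to the '@' part.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("LTzm@yA$U");
        User hbaseUser = User.create(ugi);
        return ConnectionFactory.createConnection(hConfig, hbaseUser);
    }
}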
Here are the read and write calls:
protected Result read(String tableName, String rowKey) throws IOException {
    Get get = new Get(Bytes.toBytes(rowKey));
    get.addFamily(COLUMN_FAMILY_BYTES);
    Result res;
    Table hTable = null;
    try {
        hTable = getHbaseTable(tableName);
        res = hTable.get(get);
    } finally {
        if (hTable != null) {
            releaseHbaseTable(hTable);
        }
    }
    return res;
}

protected void writeRow(String tableName, String rowKey, Map<String, byte[]> columnData) throws IOException {
    Put cellPut = new Put(Bytes.toBytes(rowKey));
    for (String qualifier : columnData.keySet()) {
        cellPut.addColumn(COLUMN_FAMILY_BYTES, Bytes.toBytes(qualifier), columnData.get(qualifier));
    }
    Table hTable = null;
    try {
        hTable = getHbaseTable(tableName);
        if (hTable != null) {
            hTable.put(cellPut);
        }
    } finally {
        if (hTable != null) {
            releaseHbaseTable(hTable);
        }
    }
}

private Table getHbaseTable(String tableName) {
    try {
        return hbaseConnection.getTable(TableName.valueOf(tableName));
    } catch (IOException e) {
        LOGGER.error("Exception while adding table in factory.", e);
        return null;
    }
}
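For context, a typical call from my code looks like this (the row key and value here are made up; the 'visitId' qualifier is the one from the write error above):

// Hypothetical usage of the helpers above.
Map<String, byte[]> columns = new HashMap<>();
columns.put("visitId", Bytes.toBytes("12345"));
writeRow("table_name", "row-1", columns);    // succeeds as LTzm@yA$U

Result result = read("table_name", "row-1"); // fails with AccessDeniedException for 'LTzm'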