hadoop permissions issue (hdfs-site.xml dfs.permissions.enabled)


I recently installed Hadoop on my machine, and I'm having a permissions issue. I logged in as user rahul and tried to create a directory in HDFS (hdfs dfs -mkdir /rahul_workspace), but it gave me an error: Permission denied: user=Rahul, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x. A quick Google search for this error led to many responses suggesting the workaround of disabling permission checking by setting the dfs.permissions property in hdfs-site.xml to false. Now I can create directories in HDFS. With the above property set to false, can I still access all the other Hadoop services such as Hive, Pig, Sqoop, etc.?
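For context, HDFS refuses the write because of the owner and mode on the parent directory. Listing the root as the Linux user usually makes this visible (the output line below is illustrative, not taken from the poster's cluster):

hdfs dfs -ls /
# example output: drwxr-xr-x   - hdfs hdfs          0 2015-06-01 10:00 /user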


There are 4 best solutions below

Solution 1

dfs.permissions (dfs.permissions.enabled on newer Hadoop releases) is the property that enables or disables HDFS permission checking. With dfs.permissions set to false, any user can create or delete files and directories anywhere in HDFS.

With this property set to false, you can still access all other Hadoop services such as Hive, Pig, etc.
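If you do go down that route, the setting the question refers to would look roughly like this in hdfs-site.xml (the property is dfs.permissions on older releases and dfs.permissions.enabled on current ones, and the NameNode needs a restart for the change to take effect):

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>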

Solution 2

HDFS treats the user who started the NameNode process as the 'superuser'. This user is typically the hdfs user. The /user directory you are trying to write to is owned by the hdfs user and the hdfs group, with read and execute (but not write) permissions for members of the hdfs group and for others.

Disabling permissions seems a little extreme. You have a few options here; the first is to create your user directory as the hdfs user.

sudo -u hdfs hdfs dfs -mkdir /rahul_workspace
sudo -u hdfs hdfs dfs -chown Rahul /rahul_workspace

This will create your directory and change the ownership to you. Another option is to create your directory in an area where you already have write permissions. Typically /tmp is wide open.

hdfs dfs -mkdir /tmp/rahul_workspace
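
A related option, not spelled out above but common practice, is to give the user a proper home directory in HDFS so that relative paths also resolve somewhere writable (the rahul user and group names are assumed from the question):

sudo -u hdfs hdfs dfs -mkdir -p /user/rahul
sudo -u hdfs hdfs dfs -chown rahul:rahul /user/rahul
# rahul can then create directories under his home with a relative path:
hdfs dfs -mkdir rahul_workspace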

Solution 3

On Cloudera: what I did was create a user with default permissions in Hue, e.g. user1.

Then, from the Linux shell, first take on the hdfs superuser identity: export HADOOP_USER_NAME=hdfs

Then, for your Linux user name, e.g. "user1":

try: hdfs dfs -chown user1:user1 /

Now, if you have the proper permissions:

try to create a directory under '/': hdfs dfs -mkdir /testuser1
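
Put together, the sequence from this solution looks roughly like the following; user1 and /testuser1 are just the example names used above:

export HADOOP_USER_NAME=hdfs     # act as the HDFS superuser for the next command
hdfs dfs -chown user1:user1 /    # hand ownership of the root directory to user1
hdfs dfs -ls -d /                # verify the new owner before continuing
unset HADOOP_USER_NAME           # drop back to the local user (user1)
hdfs dfs -mkdir /testuser1       # user1 can now create directories under /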

Solution 4

We have used

hadoop fs -setfacl -R -m user:xyz:rwx /data/dev/dl/fvc/df

to overcome this problem under the top-level directory (/data).
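
Note that HDFS ACLs only work if they are enabled on the NameNode (dfs.namenode.acls.enabled=true in hdfs-site.xml); otherwise setfacl is rejected. To confirm the ACL entry actually took effect, it can be listed afterwards (xyz and the path are taken from the command above):

hadoop fs -getfacl /data/dev/dl/fvc/df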