I have recently started learning Hadoop and I am getting the error below while creating a new folder -
vm4learning@vm4learning:~/Installations/hadoop-1.2.1/bin$ ./hadoop fs -mkdir helloworld
Warning: $HADOOP_HOME is deprecated.
15/06/14 19:46:35 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Request you to help. Below are the namenode logs -
2015-06-14 22:01:08,158 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory /home/vm4learning/Installations/hadoop-1.2.1/data/dfs/name does not exist
2015-06-14 22:01:08,161 ERROR org.apache.hadoop.hdfs.server.namenode.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /home/vm4learning/Installations/hadoop-1.2.1/data/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:304)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:104)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:427)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:395)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:299)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
2015-06-14 22:01:08,182 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /home/vm4learning/Installations/hadoop-1.2.1/data/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:304)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:104)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:427)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:395)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:299)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
2015-06-14 22:01:08,185 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at vm4learning/192.168.1.102
************************************************************/
Before creating a directory, you should make sure your Hadoop installation is working by running the

jps

command and looking for any missing process. In your case, the NameNode isn't up.
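For a pseudo-distributed Hadoop 1.x setup, the output should look roughly like this (the process IDs below are just illustrative and will differ on your machine); if NameNode is missing from the list, that is the problem:

$ jps
4850 NameNode
5021 DataNode
5195 SecondaryNameNode
5282 JobTracker
5451 TaskTracker
5583 Jps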
If you look at the logs, it appears that some folders haven't been created. Do this:
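A minimal sketch: the name directory path is taken from your log, while the data directory is my assumption (a sibling folder); adjust it to whatever layout you actually want:

# NameNode metadata directory (path taken from the error in your log)
mkdir -p /home/vm4learning/Installations/hadoop-1.2.1/data/dfs/name
# DataNode block storage directory (assumed sibling path; adjust if yours differs)
mkdir -p /home/vm4learning/Installations/hadoop-1.2.1/data/dfs/data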
Then specify the following in hdfs-site.xml.
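Something along these lines, using the Hadoop 1.x property names and the directories created above (the paths are assumptions based on your log):

<!-- hdfs-site.xml: point HDFS at the directories created above -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/vm4learning/Installations/hadoop-1.2.1/data/dfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/vm4learning/Installations/hadoop-1.2.1/data/dfs/data</value>
  </property>
</configuration>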
Then restart Hadoop, and remember to format the NameNode before doing anything else.
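Roughly like this, run from the bin directory as in your session (formatting wipes anything already in HDFS, which is fine here since the NameNode never started):

# format the NameNode so it writes its metadata into dfs.name.dir
./hadoop namenode -format
# start all daemons (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker)
./start-all.sh
# verify everything is running
jps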