Hadoop 2.7.2 - Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode


I am a newbie and I am trying to find a solution to this problem. I followed this tutorial to set up Hadoop 2.7.2 on Ubuntu 15.10:

http://idroot.net/tutorials/how-to-install-apache-hadoop-on-ubuntu-14-04/

When I launch "hdfs namenode format" I keep receiving this error: Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

This is the .bashrc content:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

Can anyone help me solve this (stupid, I think) question?

Many thanks, Kama

There are 4 solutions below.

Solution 1:
First, make sure that the namenode and datanode directories already exist at the locations specified in the hdfs-site.xml file. You can create them with mkdir, as sketched below.
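A minimal sketch, assuming the paths /usr/local/hadoop_store/hdfs/namenode and /usr/local/hadoop_store/hdfs/datanode that many tutorials use; substitute whatever dfs.namenode.name.dir and dfs.datanode.data.dir are actually set to in your hdfs-site.xml:

# Hypothetical paths -- replace with the values from your hdfs-site.xml
sudo mkdir -p /usr/local/hadoop_store/hdfs/namenode
sudo mkdir -p /usr/local/hadoop_store/hdfs/datanode
# Give your Hadoop user ownership so the daemons can write there
# ($USER assumes you run Hadoop as your own login user)
sudo chown -R $USER:$USER /usr/local/hadoop_store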

Then try to format the namenode using

hdfs namenode -format

or

/usr/local/hadoop/bin/hdfs namenode -format

Please note the hyphen.


My .bashrc configuration for Hadoop:

#HADOOP VARIABLES START
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
#HADOOP VARIABLES END
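After editing ~/.bashrc, reload it and check that the hdfs command resolves to the expected installation; for example:

source ~/.bashrc
which hdfs        # should print /usr/local/hadoop/bin/hdfs
hadoop version    # should report Hadoop 2.7.2
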
Solution 2:

Problem solved using the Ashes Script. The main difference is that it uses OpenJDK instead of the Oracle JRE.

Thanks for the help!

Solution 3:

One possible cause of this problem is a user-defined HDFS_DIR environment variable, which is picked up by the Hadoop scripts, for example by these lines in libexec/hadoop-functions.sh:

HDFS_DIR=${HDFS_DIR:-"share/hadoop/hdfs"}
...
if [[ -z "${HADOOP_HDFS_HOME}" ]] &&
   [[ -d "${HADOOP_HOME}/${HDFS_DIR}" ]]; then
  export HADOOP_HDFS_HOME="${HADOOP_HOME}"
fi

The solution is to avoid defining an HDFS_DIR environment variable of your own.

In my case, using sudo helped, but for the wrong reason: the problem was not with permissions but with the environment variables, since sudo starts with a clean environment that does not carry over the user-defined HDFS_DIR.
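A quick way to check for and remove the offending variable; the grep over startup files below is just one place such a definition might hide:

# See whether HDFS_DIR is set in the current environment
env | grep HDFS_DIR
# Remove it from the current shell session
unset HDFS_DIR
# Look for a permanent definition in common startup files
grep -n HDFS_DIR ~/.bashrc ~/.profile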

Solution 4:

I had this error too. In my case, some files in the /share/hadoop/yarn/ folder were missing because of an incomplete download of hadoop.tar.gz, which could still be extracted from the command line without any obvious error. May help you, cheers.
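One way to rule this out is to compare the tarball's checksum against the one published by Apache and to confirm that the YARN jars actually unpacked; the file names and install path below are assumptions, so adjust them to your mirror and setup:

# Compare this digest with the checksum file published for the release
sha256sum hadoop-2.7.2.tar.gz
# Confirm the YARN jars are present under the install directory
ls /usr/local/hadoop/share/hadoop/yarn/*.jar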