Hadoop: JAVA_HOME is not set

Like everyone else in the world, I'm following this hadoop tutorial. I get to the point where I format HDFS, and I get this:

user@linux01:~$ sudo $HADOOP_INSTALL/bin/hadoop namenode -format
Error: JAVA_HOME is not set.

Well, that's funny; I set JAVA_HOME in my /etc/profile.

user@linux01:~$ tail -n 4 /etc/profile
export JAVA_HOME=/usr/local/jdk1.6.0_32/bin
export JDK_HOME=$JAVA_HOME
export PATH=$PATH:/usr/local/jdk1.6.0_32/bin
export HADOOP_INSTALL=/usr/local/hadoop/hadoop-1.0.3

Did I mess that up somehow?

user@linux01:~$ echo $JAVA_HOME
/usr/local/jdk1.6.0_32/bin
user@linux01:~$ ls $JAVA_HOME
appletviewer  extcheck       jar        javac    and so forth...

Seems to work. Maybe it absolutely has to be set in my hadoop-env.sh?

# The java implementation to use.  Required.
export JAVA_HOME=$JAVA_HOME

Lazy, yeah, but I still get "JAVA_HOME is not set" with or without this line. I'm running low on ideas. Does anyone see anything I'm missing?

3 Answers

BEST ANSWER

Thank you @Chris Shain and @Chris White for your hints. I was running hadoop through sudo, and sudo doesn't automatically pass along the environment variables I set in my own profile. I logged in as my hadoop user (I had chown'd the hadoop install directory to that user) and was able to format the HDFS.
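
For anyone hitting the same wall, here is a minimal sketch of the difference, using the same paths as above (by default sudo resets the environment, and sudo -E preserves it only if the sudoers policy allows):

# In my own shell the variable is there
echo $JAVA_HOME                               # /usr/local/jdk1.6.0_32/bin

# sudo resets the environment by default, so the value is gone
sudo sh -c 'echo "JAVA_HOME=$JAVA_HOME"'      # prints JAVA_HOME=

# -E asks sudo to preserve the caller's environment (if the sudoers policy permits it)
sudo -E sh -c 'echo "JAVA_HOME=$JAVA_HOME"'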

Secondary problem: when I tried to start Hadoop, the NameNode and JobTracker started successfully, but the DataNode, SecondaryNameNode, and TaskTracker failed to start. I dug in a little bit. The NameNode and JobTracker are started via hadoop-daemon.sh, but the DataNode, SecondaryNameNode, and TaskTracker are started by hadoop-daemons.sh (note the extra "s"), which launches them over ssh in fresh shells that never read my /etc/profile. The resolution was to set JAVA_HOME explicitly in conf/hadoop-env.sh.
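
In other words, hard-code the real JDK path in conf/hadoop-env.sh instead of referencing $JAVA_HOME. A sketch using my install location (Hadoop invokes $JAVA_HOME/bin/java, so the variable should point at the JDK root rather than its bin directory):

# conf/hadoop-env.sh
# The java implementation to use.  Required.
# Hard-coded so the daemon scripts see it regardless of the caller's environment.
export JAVA_HOME=/usr/local/jdk1.6.0_32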

ANSWER

Try using the short (8.3) name to avoid spaces in the path.

"C:\Program Files" has the short name C:\Progra~1 (you can verify this by running dir /x in a Command Prompt, or by entering the short name into the File Explorer address bar and seeing that it resolves).

Set your JAVA_HOME this way:

export JAVA_HOME="/cygdrive/c/Progra~1/Java/jdk1.7.0_10"
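
From the same Cygwin shell you can double-check both pieces (a quick sketch; cygpath -d prints the DOS short form of a path, and java -version confirms that JAVA_HOME resolves to a JVM):

# Print the DOS (8.3) short form of a long Windows path
# (e.g. C:\Program Files becomes C:\PROGRA~1)
cygpath -d "/cygdrive/c/Program Files/Java/jdk1.7.0_10"

# Confirm that JAVA_HOME actually points at a working JVM
"$JAVA_HOME"/bin/java -version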

Answered by user2540312

ANSWER

First, a general note: if you are working in Cygwin, make sure you are logged in to your system as an Administrator.

I faced this same "JAVA_HOME is not set" problem while executing namenode -format. Below is what I did to fix it:

  1. Reinstalled the JDK outside of Program Files, in a location with no spaces in the path, for example D:/Installations/JDK7.

  2. Went into the bin folder of the hadoop installation (version 1.2.1) and edited the "hadoop" script, which is the file with no file extension.

  3. Searched for the JAVA_HOME variable.

  4. Just before the first use of $JAVA_HOME, I added this line:

export JAVA_HOME=/cygdrive/D/Installations/JDK7/

This is how it looks now:

fi

export JAVA_HOME=/cygdrive/D/Installations/JDK7/

# some Java parameters
if [ "$JAVA_HOME" != "" ]; then
  #echo "run java in $JAVA_HOME"
  JAVA_HOME=$JAVA_HOME
fi

Note: /cygdrive/ has to precede your JDK installation path, the colon after the drive letter is not needed, and the path must use forward slashes.
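
If you are unsure what the converted path should look like, Cygwin's cygpath utility can translate it for you (a sketch using the install location above):

# Convert a Windows path to the /cygdrive form used above
cygpath -u 'D:\Installations\JDK7'
# Prints: /cygdrive/d/Installations/JDK7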

Additionally, I set JAVA_HOME in the Windows system environment variables as well, just to be doubly sure.

Run the command again and it should work now.
