I installed Hadoop 1.2.1, Eclipse Kepler (the latest version), and JDK 1.7.0. I followed the steps described in http://hadoop.apache.org/docs/r1.2.1/single_node_setup.html#PseudoDistributed and set the configuration as follows:
conf/core-site.xml:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
conf/hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
conf/mapred-site.xml:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
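For reference, after starting the cluster the daemons and the HDFS connection can be sanity-checked from the terminal. This is just a rough sketch (run from the Hadoop install directory):
bin/start-all.sh
jps                    # should list NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker
bin/hadoop fs -ls /    # quick smoke test against hdfs://localhost:9000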
Finally, I could operate Hadoop from the Ubuntu terminal. However, when I installed the Eclipse plugin and set the Map/Reduce Master port to 9001 and the DFS Master port to 9000, I could not connect to Hadoop and got this error:
Error: Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException
Even though there was no problem when I started Hadoop (start-all.sh) from the terminal, I could not connect to Hadoop from Eclipse.
(I even allowed ports 9000 and 9001 through the firewall, but it didn't solve the problem.)
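To rule out a port or firewall issue, one quick check (a sketch; the netstat flags are the usual Linux ones) is to see whether anything is listening on those ports and whether they accept connections:
sudo netstat -tlnp | grep -E ':(9000|9001)'
telnet localhost 9000   # should connect if the NameNode RPC port is reachable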
You need to build the hadoop-eclipse plugin jar yourself, as newer versions of Hadoop no longer ship it. The source can be found under $HADOOP_HOME/src/contrib/ in a folder named eclipse-plugin.
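The plugin build is ant-based. A typical invocation looks roughly like the following (treat it as a sketch: the exact property names and any extra tweaks depend on the build.xml shipped with your tree, and the guide below covers the details):
cd $HADOOP_HOME/src/contrib/eclipse-plugin
ant jar -Declipse.home=/path/to/eclipse -Dversion=1.2.1
The resulting jar should end up under $HADOOP_HOME/build/contrib/eclipse-plugin/; copy it into Eclipse's plugins/ directory and restart Eclipse.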
I followed this guide and was able to set up Eclipse to run Hadoop programs:
Guide to build eclipse-plugin
Next, I went through this guide to run Hadoop programs directly from Eclipse:
Run hadoop from inside of Eclipse
Hope this helps.