I am following this guide on GitHub and I am not able to run the example MapReduce job mentioned in Step 5.
I am aware that this file no longer exists:
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
And I am aware that the same file can now be found here:
/usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u6.jar
So I form my call as follows:
curl -v -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/$user/jobs" \
  -d '{"jar":"/usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u6.jar","class_name":"WordCount","lib_jars":"/usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u6.jar","input":"testdir","output":"testoutput"}' \
  -H "Content-Type: application/json" \
  -H "X-Auth-Token: $TOKEN"
The input directory exists in my HDFS user space and contains a file called testdata.txt. The testoutput folder does not exist in my HDFS user space, since I know a pre-existing output folder causes problems.
When I execute this curl command, the error I get is {"success":"false","error":1}
which is not very descriptive. Is there something I am missing here?
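As a side check, the input directory can be listed over HTTP. The endpoint below is an assumption on my part (the stock HttpFS/WebHDFS port 14000 on the storage node), not something confirmed for this deployment:

```shell
# Hypothetical sanity check (not from the official guide): list the input
# directory through an HttpFS/WebHDFS-style API. Port 14000 is the stock
# HttpFS port and may differ on this Cosmos deployment.
user="${user:-frb}"   # example username from this thread; replace with your own
URL="http://storage.cosmos.lab.fiware.org:14000/webhdfs/v1/user/$user/testdir?op=LISTSTATUS"
echo "$URL"
# Only issue the request when a token is actually available:
if [ -n "${TOKEN:-}" ]; then
  curl -X GET "$URL" -H "X-Auth-Token: $TOKEN"
fi
```

If testdir is reachable, the response should be a JSON FileStatuses listing that includes testdata.txt.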
This has just been tested with my user
frb
and a valid token for that user. Please observe that the fat jar with the MapReduce examples in the "new" cluster (computing.cosmos.lab.fiware.org) is at
/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
, as detailed in the documentation.
/usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u6.jar
was the fat jar in the "old" cluster (cosmos.lab.fiware.org).
EDIT 1
Finally, the user had no account in the "new" pair of Cosmos clusters in FIWARE Lab (
storage.cosmos.lab.fiware.org
and
computing.cosmos.lab.fiware.org
), where Tidoop runs, but only in another, "old" cluster (
cosmos.lab.fiware.org
). Thus, the issue was fixed by simply provisioning an account in the "new" ones.
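For completeness, here is the request from the question with the jar path of the "new" cluster substituted in. This is a template, not a verified invocation; $user and $TOKEN are the same placeholders as in the question:

```shell
# Same Tidoop request as in the question, but pointing at the examples jar
# that ships with the "new" cluster, as quoted in the answer above.
JAR=/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar
PAYLOAD='{"jar":"'$JAR'","class_name":"WordCount","lib_jars":"'$JAR'","input":"testdir","output":"testoutput"}'
echo "$PAYLOAD"
# Only send the request when credentials are actually available:
if [ -n "${TOKEN:-}" ] && [ -n "${user:-}" ]; then
  curl -v -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/$user/jobs" \
    -d "$PAYLOAD" \
    -H "Content-Type: application/json" \
    -H "X-Auth-Token: $TOKEN"
fi
```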