ImportTsv command gives "Container exited with a non-zero exit code 1" error

I am trying to load a TSV file into an existing HBase table using the following command:

/usr/local/hbase/bin$ hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,cf:value '-Dtable_name.separator=\t' Table-name /hdfs-path-to-input-file

But when I execute the above command, I get the following error:

Container id: container_1434304449478_0018_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
2015-06-16 13:37:48,779 INFO  [main] mapreduce.Job: Counters: 0

On digging further into the container logs through the web UI, I found the following error:

2015-06-16 13:28:02,776 ERROR [main] org.apache.hadoop.mapreduce.v2.app.client.MRClientService: Webapps failed to start. Ignoring for now:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer not found
    at org.apache.hadoop.conf.Configuration.getClasses(Configuration.java:1699)
    at org.apache.hadoop.http.HttpServer.getFilterInitializers(HttpServer.java:325)
    at org.apache.hadoop.http.HttpServer.<init>(HttpServer.java:267)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder$2.<init>(WebApps.java:222)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:219)
    at org.apache.hadoop.mapreduce.v2.app.client.MRClientService.serviceStart(MRClientService.java:136)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1058)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1445)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1441)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1374)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1626)
    at org.apache.hadoop.conf.Configuration.getClasses(Configuration.java:1695)

I am using Hadoop 2.6.0 and HBase 0.98 on a multi-node cluster.

I am not able to understand what's causing this error. Please guide me.

Best Answer

java.lang.ClassNotFoundException: Class org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer not found

The above line clearly indicates that the YARN web proxy jar (which provides AmFilterInitializer) is missing from the classpath.
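
As a quick check, you can confirm that the jar exists in your Hadoop install but not under HBase's lib directory. This is only a sketch: HADOOP_HOME and HBASE_HOME are assumed to point at your installations, and share/hadoop/yarn is the usual layout of the Hadoop 2.6.0 binary distribution.

# assumed paths -- adjust to your installation
ls "$HADOOP_HOME"/share/hadoop/yarn/ | grep yarn-server-web-proxy    # should list hadoop-yarn-server-web-proxy-2.6.0.jar
ls "$HBASE_HOME"/lib/ | grep yarn-server-web-proxy                   # empty output means the jar is missing here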

You need to add the following jar to your HBase lib folder:

hadoop-yarn-server-web-proxy-2.6.0.jar
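
A minimal sketch of copying it over, assuming HADOOP_HOME and HBASE_HOME point at your Hadoop and HBase installs (the exact source path is an assumption based on the standard Hadoop 2.6.0 layout):

# assumed paths -- adjust to your installation
cp "$HADOOP_HOME"/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.6.0.jar "$HBASE_HOME"/lib/

Then re-run the ImportTsv command. If your worker nodes have their own HBase installation, it is safest to copy the jar there as well.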