How to keep track of time elapsed during a Hadoop Job?


I am attempting to make an open-source contribution to the Apache Hadoop DSL plugin. Basically, I am making a new job class (let's call it xyzJob) that extends Job, and I need a way to keep track of the time elapsed during an xyzJob. Once a certain amount of time has elapsed, I need to force another action, but I cannot figure out a way to track the time correctly. mapred.task.timeout does not work because it forcefully terminates the job instead of letting me trigger the other action. Any ideas/suggestions?
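
For context, here is roughly the kind of behavior I am after. This is only a sketch with placeholder names (XyzJobTimer, armTimeout, etc. are not part of the Hadoop DSL plugin API), showing elapsed-time tracking plus a scheduled fallback action that fires without killing the job:

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Hypothetical helper illustrating what I want xyzJob to do: record the
    // start time, expose the elapsed time, and run a fallback action once a
    // threshold passes, without terminating the underlying job.
    public class XyzJobTimer {

        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        private final long startMillis = System.currentTimeMillis();

        /** Schedule the fallback action to run after the given timeout. */
        public void armTimeout(long timeoutMinutes, Runnable forcedAction) {
            scheduler.schedule(forcedAction, timeoutMinutes, TimeUnit.MINUTES);
        }

        /** How long the job has been running so far, in milliseconds. */
        public long elapsedMillis() {
            return System.currentTimeMillis() - startMillis;
        }

        /** Cancel the timer once the job finishes normally. */
        public void shutdown() {
            scheduler.shutdownNow();
        }
    }

If there is a more idiomatic hook inside the plugin itself for this, that would be preferable to rolling my own timer like the above.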

