I have a requirement to parse a CSV file which can contain 100 to 10,000 rows. In order to do this I am using the route configuration below.
from(inputFileUri)
    .routeId(CUSTOM_ROUTEID).delayer(1000)
    .split(body().tokenize("\n", 100, true))
    .unmarshal(new BindyCsvDataFormat(CustomObject.class))
    .convertBodyTo(List.class)
    .process(customProcessor);
This works. But I want to log how many rows were parsed successfully and how many had errors, log the error rows separately, and maybe write them to a new CSV file once the input CSV file has been processed. Is there a Camel way to do this?
If you want to write your error rows to a new CSV file without getting in the way of processing your good batches, you probably want to look into a DeadLetterChannel (DLC). But my guess is that if 1 out of your 100 rows has an issue, then the entire group of 100 rows you're splitting on will never make it to your processor.
I feel like you are better off splitting on your newlines without the group size of 100, then aggregating after your unmarshal. Rows that fail to unmarshal can be handled by a DLC, and rows that are successfully converted to a CustomObject can be grouped up with an aggregator and sent to your customProcessor as a List<CustomObject>. You don't even need to write your own AggregationStrategy - you can use GroupedBodyAggregationStrategy. It would probably look something like this:
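A rough sketch, not tested against your setup - it assumes Camel 3.x package names, that `inputFileUri`, `CUSTOM_ROUTEID`, and `customProcessor` are the fields from your route, and that the endpoint names (`direct:errorRows`, the error file URI) are placeholders you would rename:

```java
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat;
import org.apache.camel.processor.aggregate.GroupedBodyAggregationStrategy;

public class CsvRoute extends RouteBuilder {

    // placeholders standing in for your existing fields
    private final String inputFileUri = "file:input";
    private static final String CUSTOM_ROUTEID = "csvRoute";
    private final Processor customProcessor =
            exchange -> { /* handle the List<CustomObject> batch */ };

    @Override
    public void configure() {
        // rows that fail anywhere in the route (e.g. in unmarshal)
        // go to the DLC instead of failing the whole file
        errorHandler(deadLetterChannel("direct:errorRows"));

        from(inputFileUri)
            .routeId(CUSTOM_ROUTEID)
            // split one row at a time so a bad row only fails itself
            .split(body().tokenize("\n")).streaming()
                .unmarshal(new BindyCsvDataFormat(CustomObject.class))
                // regroup the good CustomObjects into batches of 100
                .aggregate(constant(true), new GroupedBodyAggregationStrategy())
                    .completionSize(100)
                    .completionTimeout(1000) // flush a partial final batch
                    .process(customProcessor)
            .end();

        // failed rows: log the raw line and append it to an error CSV
        from("direct:errorRows")
            .log("Failed to parse row: ${body}")
            .to("file:errors?fileName=error-rows.csv&fileExist=Append");
    }
}
```

For the counts, you could increment a parsed counter in `customProcessor` and an error counter in the `direct:errorRows` route (or inspect the DLC endpoint afterwards), then log both when the split is done.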