Background

I have a Spring Batch job where:

1. FlatFileItemReader - reads one row at a time from the file.
2. ItemProcessor - transforms the row from the file into a List<MyObject> and returns the List. That is, each row in the file is broken down into a List<MyObject> (1 row in the file transformed to many output rows).
3. ItemWriter - writes the List<MyObject> to a database table. (I used this implementation to unpack the List received from the processor and delegate to a JdbcBatchItemWriter.)
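To make step 2 concrete, the fan-out from one file row to many output rows could look like the sketch below. MyObject's field, the comma delimiter, and the class name RowFanOutProcessor are illustrative assumptions, not taken from the question; in the real job this logic would sit in an ItemProcessor<String, List<MyObject>>.

```java
import java.util.ArrayList;
import java.util.List;

// Assumed shape of the output item; the real MyObject's fields are unknown.
class MyObject {
    final String value;
    MyObject(String value) { this.value = value; }
}

class RowFanOutProcessor {
    // Mirrors ItemProcessor<String, List<MyObject>>.process: one delimited
    // file row is split into one MyObject per field (1 row in, many rows out).
    public List<MyObject> process(String row) {
        List<MyObject> out = new ArrayList<>();
        for (String field : row.split(",")) {
            out.add(new MyObject(field));
        }
        return out;
    }
}
```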
Question

- At point 2), the processor can return a List of 100,000 MyObject instances.
- At point 3), the delegate JdbcBatchItemWriter will end up writing the entire List of 100,000 objects to the database.

My question is: the JdbcBatchItemWriter does not allow a custom batch size; for all practical purposes, the batch size equals the commit interval for the step. With this in mind, is there another ItemWriter implementation available in Spring Batch that writes to the database and allows a configurable batch size? If not, how do I go about writing a custom writer myself to achieve this?
I see no obvious way to set the batch size on the JdbcBatchItemWriter. However, you can extend the writer and use a custom BatchPreparedStatementSetter to specify the batch size. The StagingItemWriter in the Spring Batch samples is also an example of how to use a custom BatchPreparedStatementSetter.
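One way to get the same effect, sketched here with the Spring Batch types replaced by a minimal stand-in interface so the snippet is self-contained: have a wrapper writer receive the full flattened chunk and forward it to the delegate in slices of a configurable size, which is effectively what a custom getBatchSize() on a BatchPreparedStatementSetter controls. FixedBatchSizeWriter and DelegateWriter are hypothetical names, and checked-exception signatures are omitted for brevity.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the delegate; in the real job this would be
// JdbcBatchItemWriter<T>.write(List<? extends T>).
interface DelegateWriter<T> {
    void write(List<? extends T> items);
}

// Hypothetical wrapper: caps the number of items per delegate call, so each
// delegate write maps to one JDBC batch of at most batchSize statements.
class FixedBatchSizeWriter<T> {
    private final DelegateWriter<T> delegate;
    private final int batchSize;

    FixedBatchSizeWriter(DelegateWriter<T> delegate, int batchSize) {
        this.delegate = delegate;
        this.batchSize = batchSize;
    }

    // Receives the flattened chunk and re-issues it in batchSize-sized slices.
    public void write(List<T> items) {
        for (int start = 0; start < items.size(); start += batchSize) {
            int end = Math.min(start + batchSize, items.size());
            delegate.write(new ArrayList<>(items.subList(start, end)));
        }
    }
}
```

With a batch size of 3 and a chunk of 10 items, the delegate receives four calls of sizes 3, 3, 3, and 1; the commit interval of the step is unchanged, only the JDBC batch size is.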