Camel JDBC out of memory exception


I am trying to ingest data from Postgres into another DB and I am using the camel-jdbc component to do it. I have a large table, so I want to read a few rows at a time instead of the whole table at once. My route looks like this (for testing purposes only):

from(fromUri).setBody("select * from table limit 10").to("jdbc://myDataSource?resetAutoCommit=false&statement.fetchSize=2").split(body()).streaming().process(test)

As shown above, I limit the query to 10 rows for testing, and I have set fetchSize to 2 so that only 2 rows are fetched at a time. However, I still receive all 10 rows at once. When I remove the "limit 10" from the query, I get an OutOfMemory error just before the split step, which tells me that it is trying to load the entire result set into memory.
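For reference, this is roughly what the full route looks like written out as a RouteBuilder; the trigger endpoint and the test processor here are simplified stand-ins for what I actually use:

    import org.apache.camel.builder.RouteBuilder;

    public class IngestRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Stand-in trigger; in my code "fromUri" points at the real source endpoint.
            from("timer://ingest?repeatCount=1")
                .setBody(constant("select * from table limit 10"))
                // "myDataSource" is the Postgres DataSource registered in the Camel registry.
                .to("jdbc://myDataSource?resetAutoCommit=false&statement.fetchSize=2")
                .split(body()).streaming()
                    // Stand-in for the "test" processor: print each row (a Map of column -> value).
                    .process(exchange -> System.out.println(exchange.getIn().getBody()));
        }
    }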

What am I missing here or what am I doing wrong?

Thanks for the help.

There is 1 answer below.

I think fetchSize is more of a hint to the JDBC driver. To put a real limit on the number of rows returned, you can use the maxRows option, e.g. statement.maxRows=2. You can read more about both options in the JDBC Statement Javadoc:

https://docs.oracle.com/javase/7/docs/api/java/sql/Statement.html
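For comparison, here is roughly what those two settings mean on a plain java.sql.Statement (a hedged sketch; the connection details are placeholders, and in camel-jdbc the same setters are reached through the statement.fetchSize and statement.maxRows URI options):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class FetchSizeVsMaxRows {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details, not taken from the question.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
                 Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(2); // hint: how many rows to fetch per round trip
                stmt.setMaxRows(2);   // hard cap: the ResultSet will not contain more than 2 rows
                try (ResultSet rs = stmt.executeQuery("select * from table")) {
                    while (rs.next()) {
                        System.out.println(rs.getObject(1));
                    }
                }
            }
        }
    }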