I am trying to read XML file details from the Sun Identity Manager database using the snippet below:
String xmlTxt = "";
ArrayList<String> xmlList = new ArrayList<String>();
BLOB blob = null;
OracleConnection conn = null;
OraclePreparedStatement stmt = null;
OracleResultSet rs = null;
GZIPInputStream gStream = null;
log.debug("Initializing DB connection...");
try {
    Class.forName("oracle.jdbc.driver.OracleDriver");
    conn = (OracleConnection) DriverManager.getConnection(dbURL, dbUserName, dbPassword);
    // prepareStatement, not prepareCall: this is a plain query, not a stored procedure
    stmt = (OraclePreparedStatement) conn.prepareStatement("SELECT XML FROM TASK WHERE TYPE='WorkItem'");
    rs = (OracleResultSet) stmt.executeQuery();
    while (rs.next()) {
        blob = rs.getBLOB(columnLabel);
        if (blob != null && !blob.isEmptyLob()) {
            // Note: available() is unreliable on GZIPInputStream (it does not
            // report the decompressed size), so just read the stream directly.
            gStream = new GZIPInputStream(blob.getBinaryStream());
            try {
                xmlTxt = IOUtils.toString(gStream, columnEncoding);
                xmlList.add(xmlTxt); // add only when a value was actually read
            } finally {
                gStream.close(); // close the per-row stream, or it leaks every iteration
            }
            blob.free(); // release each BLOB as soon as it has been consumed
            blob = null;
        }
    }
}
catch (ClassNotFoundException cnf) {
    // Handle errors for Class.forName
    log.error("ClassNotFoundException " + cnf.getMessage(), cnf);
}
catch (SQLException se) {
    // Handle errors for JDBC
    log.error("SQLException " + se.getMessage(), se);
}
catch (IOException ie) {
    // Handle errors for I/O
    log.error("IOException " + ie.getMessage(), ie);
}
catch (OutOfMemoryError e) {
    // Catching OutOfMemoryError is rarely recoverable; fix the allocation instead
    log.error("OutOfMemoryError encountered: " + Runtime.getRuntime().totalMemory());
}
finally {
    try {
        if (rs != null)
            rs.close();
        if (stmt != null)
            stmt.close();
        if (blob != null)
            blob.free();
        if (conn != null)
            conn.close();
    }
    catch (SQLException se) {
        // Handle errors for JDBC during cleanup
        log.error("SQLException during close " + se.getMessage(), se);
    }
}
log.debug("End of DB Operation");
return xmlList;
Currently I am testing it in our Dev environment, where the record counts are in the hundreds and the average size of each BLOB is 2 MB. I can also see that heap consumption grows sharply as the total record count increases. My current Java heap settings are -Xmx512m -Xms64m, and in prod we have nearly 1.5 million records to process, so I am wondering how much heap would be required.
I usually get an OutOfMemoryError at the line below:
xmlTxt = IOUtils.toString(gStream,columnEncoding);
https://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/IOUtils.html
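IOUtils.toString materializes the entire decompressed XML in memory (plus intermediate copies as the buffer grows), which is what blows the heap. One way around it is to stream each BLOB straight to disk with a small fixed buffer, so memory use stays constant no matter how large the XML is. A minimal, self-contained sketch of that per-row handling using only the JDK (the in-memory gzip blob here is a stand-in for blob.getBinaryStream(); gunzipToFile and the file name are illustrative, not from the code above):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamBlobToFile {

    // Copy a (possibly huge) gzip-compressed stream to a file through a fixed
    // 8 KB buffer, so heap use is constant regardless of the XML size.
    static long gunzipToFile(InputStream blobStream, Path outFile) throws IOException {
        long total = 0;
        try (GZIPInputStream gin = new GZIPInputStream(blobStream);
             OutputStream out = Files.newOutputStream(outFile)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = gin.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for blob.getBinaryStream(): gzip a small XML into memory.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gout = new GZIPOutputStream(bos)) {
            gout.write("<WorkItem id='1'>payload</WorkItem>".getBytes("UTF-8"));
        }
        Path out = Files.createTempFile("workitem", ".xml");
        long bytes = gunzipToFile(new ByteArrayInputStream(bos.toByteArray()), out);
        System.out.println(bytes);
        System.out.println(new String(Files.readAllBytes(out), "UTF-8"));
    }
}
```

The same loop applies unchanged when the InputStream comes from blob.getBinaryStream(); the only live allocation per row is the 8 KB buffer.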
Please advise if there is a better way to optimize the code and improve performance.
Parse the data from the table in a streaming fashion rather than copying it into memory as a String/ArrayList.
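To see why streaming is the only viable option: at 1.5 million rows averaging 2 MB each, the raw XML is on the order of 3 TB, so no -Xmx setting can hold the accumulated list. Instead, hand each row's GZIPInputStream directly to a pull parser and keep only the fields you need. A sketch using the JDK's StAX API (the WorkItem/owner element names are made up for illustration; substitute the real element names from your schema, and the in-memory gzip blob stands in for blob.getBinaryStream()):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingWorkItemParse {

    // Pull-parse one gzip-compressed XML document and return the text of the
    // first <owner> element. Only the parser's cursor state lives on the heap;
    // the document itself is never materialized as a String.
    static String firstOwner(InputStream blobStream) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        try (GZIPInputStream gin = new GZIPInputStream(blobStream)) {
            XMLStreamReader reader = factory.createXMLStreamReader(gin);
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "owner".equals(reader.getLocalName())) {
                    return reader.getElementText();
                }
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for blob.getBinaryStream(): a gzipped WorkItem document.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gout = new GZIPOutputStream(bos)) {
            gout.write("<WorkItem><owner>jdoe</owner></WorkItem>".getBytes("UTF-8"));
        }
        System.out.println(firstOwner(new ByteArrayInputStream(bos.toByteArray())));
    }
}
```

Calling a method like this inside the while(rs.next()) loop, instead of xmlList.add(...), keeps per-row memory bounded; it can also help to set a modest stmt.setFetchSize(...) so the driver does not prefetch many LOB locators at once.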