How to read a huge CSV file efficiently and convert CSV values into objects based on a user-defined class in Java


I have to read a large CSV file with about 700,000 records and compare the CSV data to an API response. I was able to make this work with OpenCSV, but the deserialization step is extremely slow: it takes about an hour just to deserialize the data. I have been using the following code to read and deserialize my CSV.

    List<ProjectVO> csvValue = new CsvToBeanBuilder<ProjectVO>(new FileReader("project.csv"))
        .withType(ProjectVO.class)
        .build()
        .parse();

Is there any other efficient method to replace it?

My ProjectVO class looks like this:

    .
    .
    .
    @JsonIgnoreProperties(ignoreUnknown = true)
    public class ProjectVO {

        @JsonProperty("actualCompletionDate")
        @CsvBindByName(column = "actualCompletionDate")
        private String actualCompletionDate;
        .
        .
        .

I am comparing my CSV data and the JSON response with something like the following:

    assertEquals("The value for column 'actualCompletionDate' should match in both sources for the ID: "
        + jsonValue.getId(), csvValue.getActualCompletionDate(), jsonValue.getActualCompletionDate());
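Part of the cost here is materializing all 700,000 beans into one list before any comparison starts. A common alternative is to stream records one at a time and compare as you go (OpenCSV's `CsvToBean` also exposes `iterator()` for lazy parsing). Below is a self-contained sketch of that streaming idea using a plain `BufferedReader` and a hypothetical two-column `ProjectRow` bean; the `split`-based parsing ignores quoted fields, so it is illustration only, not a replacement for a real CSV parser:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.function.Consumer;

// Hypothetical minimal bean holding two of the CSV columns.
class ProjectRow {
    String id;
    String actualCompletionDate;
}

public class CsvStream {

    // Map one CSV line to a bean (no quote handling; illustration only).
    static ProjectRow parseLine(String line) {
        String[] cols = line.split(",", -1);
        ProjectRow row = new ProjectRow();
        row.id = cols[0];
        row.actualCompletionDate = cols[1];
        return row;
    }

    // Stream beans one at a time instead of collecting a 700k-element list.
    static void forEachRecord(Reader source, Consumer<ProjectRow> action) throws IOException {
        try (BufferedReader br = new BufferedReader(source)) {
            String header = br.readLine();   // skip the header row
            String line;
            while ((line = br.readLine()) != null) {
                action.accept(parseLine(line));
            }
        }
    }

    public static void main(String[] args) throws IOException {
        String csv = "id,actualCompletionDate\nP1,2024-01-31\nP2,2024-02-15\n";
        forEachRecord(new StringReader(csv),
                row -> System.out.println(row.id + " -> " + row.actualCompletionDate));
    }
}
```

With OpenCSV, the equivalent pattern would be iterating `csvToBean.iterator()` and running the `assertEquals` comparison per record, so memory use stays constant regardless of file size.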