I'm writing a program that copies files from one PC to another across a network. While the copy is in progress, I dynamically display to the user how much of the file has been copied (done in JSP). On the server side I'm using Java, and things work great for most files, but when I come across one that is 50 MB I hit somewhat of a wall: my program just sits there for a while, though it eventually starts writing to the output stream. Is this due to just loading the file into the input stream? I know the file is a little big, so I'm not sure there's much I can do about it. Below is the code I'm using to copy the file to the server. Any suggestions or help would be much appreciated. I'm totally open to redesigning this; I just want an easy way to report the progress of the copy.
// Requires: import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException;
FileInputStream fin = null;
FileOutputStream fout = null;
// Use long for the size and the running counter: casting length() to int
// breaks for files over 2 GB, and (counter * 100) overflows int once more
// than ~21 MB has been copied, which would make progressVal go negative.
long length = srcFile.length();
long counter = 0;
int r = 0;
byte[] b = new byte[4096];
try
{
    fin = new FileInputStream(srcFile);
    fout = new FileOutputStream(dd);
    int progressVal = 0;
    while ((r = fin.read(b)) != -1)
    {
        counter += r;
        fout.write(b, 0, r);
        progressVal = (int) (counter * 100 / length);
        if (progressVal > 0)
        {
            // communicate to JSP
            WorkstationController.progressMade = progressVal;
        }
    }
}
catch (Exception e)
{
    e.printStackTrace();
}
finally
{
    // Close each stream independently and null-check first: the original
    // finally block would throw NullPointerException if a constructor
    // failed, and fout would never be closed if fin.close() threw.
    if (fin != null) { try { fin.close(); } catch (IOException ignored) {} }
    if (fout != null) { try { fout.close(); } catch (IOException ignored) {} }
}
It could depend on many factors. What version of Java are you running? Depending on the version, the default heap and PermGen sizes vary; you might be hitting the heap size limit and causing the JVM to thrash in garbage collection to recover space.
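If you want to confirm what limit your JVM is actually running with, here is a minimal sketch (nothing here is specific to your app; Runtime simply reports the values the JVM was started with, i.e. what -Xmx resolved to):

// Print the JVM's configured memory limits to the server log.
long maxHeap = Runtime.getRuntime().maxMemory();     // effective -Xmx, in bytes
long allocated = Runtime.getRuntime().totalMemory(); // heap currently reserved
System.out.println("Max heap: " + (maxHeap / (1024 * 1024)) + " MB, "
        + "allocated so far: " + (allocated / (1024 * 1024)) + " MB");

If the max heap turns out to be small, you can raise it with the standard -Xmx flag when launching the JVM (e.g. -Xmx512m).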
One of the things you can do is try different sizes for chunking the data. Also, try wrapping the streams in BufferedInputStream/BufferedOutputStream; see the sketch below.
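For example, here is a minimal sketch of the same copy loop with buffering added (srcFile and dd are your existing variables; the 64 KB chunk size is just an assumed starting point to experiment from, not a recommendation, and the enclosing method is assumed to declare throws IOException):

// Requires the java.io.* imports.
byte[] b = new byte[64 * 1024]; // try different chunk sizes here and measure
BufferedInputStream in = null;
BufferedOutputStream out = null;
try
{
    in = new BufferedInputStream(new FileInputStream(srcFile));
    out = new BufferedOutputStream(new FileOutputStream(dd));
    int r;
    while ((r = in.read(b)) != -1)
    {
        out.write(b, 0, r); // progress accounting would go here, as in your loop
    }
    out.flush(); // push any bytes still sitting in the buffer before closing
}
finally
{
    if (in != null) { try { in.close(); } catch (IOException ignored) {} }
    if (out != null) { try { out.close(); } catch (IOException ignored) {} }
}

The buffered wrappers batch the underlying disk reads and writes, so varying your chunk size and the buffering independently lets you see which one is actually the bottleneck.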