Java recursive file copy optimization
I have a small application that copies PDF files (including subfolders) into a destination folder, but it is very slow. It works; I just want to optimize it. Can you help me?
Code:
public void pdfFolderCopy(File src, File dest) throws IOException {
    if (src.isDirectory()) {
        if (!dest.exists()) {
            dest.mkdir();
        }
        String[] files = src.list();
        for (String file : files) {
            File srcFile = new File(src, file);
            File destFile = new File(dest, file);
            pdfFolderCopy(srcFile, destFile);
        }
    } else {
        if (!dest.exists()) {
            System.out.println("Copy: " + src);
            // Use the Apache Commons IO copyFile method:
            FileUtils.copyFile(src, dest);
        }
    }
}
When every file already exists, it takes about half a minute. It takes about 5 minutes when it needs to copy roughly 500 files.
I would first simply try

/bin/cp -r -n src dest

where -r means recursive and -n means no overwriting. There is a good chance the OS can do this faster than your code, and then you are done. For this you only need something like

new ProcessBuilder().command("/bin/cp", "-r", "-n", src.toString(), dest.toString()).start();
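A slightly fuller sketch of that ProcessBuilder call (my elaboration, not part of the original answer): waiting for cp to finish and checking its exit status, so failures are not silently ignored. This assumes a POSIX system where /bin/cp exists; the class and method names are mine.

```java
import java.io.File;
import java.io.IOException;

public class CpCopy {
    /** Delegates the recursive, no-overwrite copy to the OS cp command (POSIX only). */
    public static void copyWithCp(File src, File dest)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder()
                .command("/bin/cp", "-r", "-n", src.toString(), dest.toString())
                .inheritIO()            // surface cp's error messages, if any
                .start();
        int exit = p.waitFor();         // block until cp finishes
        if (exit != 0) {
            throw new IOException("cp exited with status " + exit);
        }
    }
}
```

Because -n never overwrites, running this twice over the same tree is harmless: the second run simply skips everything.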
If you want to do it in Java, I would try a few minor changes:

- dest.mkdir() can be called without any exists() check; skipping the check is slightly faster (maybe irrelevant)
- listFiles() can be faster than listing the names and creating the File objects manually (maybe irrelevant)
- since you create the dest folders yourself, you only need the pre-existing-file check for plain files, not for directories
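Putting those three tweaks together, a revised copy method might look like the sketch below. This is my own assembly of the suggestions, not code from the answer, and I have swapped FileUtils.copyFile for java.nio.file.Files.copy purely to keep the example dependency-free; with Apache Commons IO on the classpath the original call works just as well.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class FastFolderCopy {
    /** Recursive copy with the tweaks above: unconditional mkdir, listFiles(),
        and an existence check only for plain files. */
    public static void pdfFolderCopy(File src, File dest) throws IOException {
        if (src.isDirectory()) {
            dest.mkdir();                       // no exists() check; a no-op if already there
            File[] children = src.listFiles();  // File objects directly, no new File(src, name)
            if (children != null) {
                for (File child : children) {
                    pdfFolderCopy(child, new File(dest, child.getName()));
                }
            }
        } else if (!dest.exists()) {
            Files.copy(src.toPath(), dest.toPath());  // stand-in for FileUtils.copyFile
        }
    }
}
```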
I guess multithreading can speed it up a good deal: let the main thread generate the copy jobs and submit them to an executor (with some 4-8 threads).
Note that such multithreaded writing can cause high disk fragmentation, but I would not worry about it. If I had to, I would instead create file-reading jobs, let them return the file contents (some n * 100 KB each), and use a single writer thread.
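A minimal sketch of the executor idea, under the assumptions above (class names and the pool size of 4 are mine): the main thread walks the tree and creates directories synchronously, so every parent directory exists before any file-copy job for its children is submitted to the pool.

```java
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelFolderCopy {
    private final ExecutorService pool = Executors.newFixedThreadPool(4); // some 4-8 threads

    /** Main thread walks the tree and makes directories; file copies become pool jobs. */
    public void copy(File src, File dest) {
        if (src.isDirectory()) {
            dest.mkdir();                       // done before children are submitted
            File[] children = src.listFiles();
            if (children != null) {
                for (File child : children) {
                    copy(child, new File(dest, child.getName()));
                }
            }
        } else if (!dest.exists()) {
            pool.submit(() -> {                 // the actual copy runs on a pool thread
                try {
                    Files.copy(src.toPath(), dest.toPath());
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    /** Wait for all submitted copy jobs to finish. */
    public void awaitCompletion() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```

The !dest.exists() check still happens on the main thread, so a rerun over an already-copied tree submits no jobs at all.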