Copy files using Apache FileUtils.copyFile

I am using Apache Commons IO to copy a file to a directory with FileUtils.copyFile(src, dest).

The file that I copy is updated every 2 seconds by an external provider, and I really do not want to block it (my application runs on Windows, where this would cause all kinds of problems). Can someone advise me on the safest way to copy, or even just read, a file without locking the source file?

Yours faithfully

3 answers

Since you do not explicitly lock the file before the copy, the operating system's default file locking behavior applies.

I ran a quick test to find out what happens on a Windows machine when you copy the source file while an external process writes to it every 2 seconds.

The process that writes to the file never encountered a problem.

    // Writer: appends a timestamped line to the shared file every 2 seconds for one minute.
    public static void main(String[] args) {
        File f = new File("..\\test.txt");
        long startTime = System.currentTimeMillis();
        long elapsedTime = 0;
        while (elapsedTime < 1000 * 60) {
            try {
                FileUtils.writeStringToFile(f, System.currentTimeMillis() + " : Data Write\r\n", true);
                Thread.sleep(2000);
            } catch (IOException ex) {
                Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
            } catch (InterruptedException ex) {
                Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
            }
            elapsedTime = System.currentTimeMillis() - startTime;
        }
    }

The process that copies the file throws an exception if it does not finish the copy before the source file changes its length. This exception seems to be more of a warning that the copied version of the file is incomplete. When I synchronized the timing so that the copy did not read from the file at the same moment a write occurred, this exception was not thrown.

    // Copier: copies the shared file to another directory every 2 seconds for one minute.
    public static void main(String[] args) {
        File f = new File("..\\test.txt");
        long startTime = System.currentTimeMillis();
        long elapsedTime = 0;
        while (elapsedTime < 1000 * 60) {
            try {
                FileUtils.copyFile(f, new File("..\\test\\test.txt"));
                Thread.sleep(2000);
            } catch (IOException ex) {
                Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
            } catch (InterruptedException ex) {
                Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
            }
            elapsedTime = System.currentTimeMillis() - startTime;
        }
    }

Based on this test, I would not worry about what happens to the writing process. I would, however, handle the case where java.io.IOException: Failed to copy full contents from '..\test.txt' to '..\test\test.txt' is thrown.
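One way to handle it is to simply retry the copy when it fails. Here is a minimal sketch, assuming a fixed number of attempts and a short back-off; the class name RetryingCopy, the method copyWithRetry, the maxAttempts parameter and the 500 ms delay are illustrative choices, not part of the answer above:

    import java.io.File;
    import java.io.IOException;

    import org.apache.commons.io.FileUtils;

    public class RetryingCopy {

        // Retry the copy a few times; an incomplete copy surfaces as an IOException
        // ("Failed to copy full contents ..."), so back off briefly and try again.
        // Assumes maxAttempts >= 1.
        static void copyWithRetry(File src, File dest, int maxAttempts)
                throws IOException, InterruptedException {
            IOException last = null;
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                try {
                    FileUtils.copyFile(src, dest);
                    return; // copy finished before the source changed underneath it
                } catch (IOException ex) {
                    last = ex;         // most likely the incomplete-copy exception
                    Thread.sleep(500); // wait so the next attempt falls between writes
                }
            }
            throw last;
        }
    }

With the 2-second write interval described in the question, a couple of attempts should be enough for one copy to land between writes.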


This is an excerpt from How to cache a file in Java; you can find more there.

Java file caching


Reading files from disk can be slow, especially when an application reads the same file many times. Caching solves this problem by storing frequently used files in memory. This allows the application to read contents from fast local memory instead of a slow hard drive. The design for caching a file in Java includes three elements (a sketch of the second one follows the list):

  • File Caching Algorithm
  • Data structure for storing cached content
  • Caching API for storing cached files
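As an illustration of the data structure element, here is a minimal sketch of one possible shape for the cached content; the CachedFile and FileCache names are made up for this example and are not taken from the article:

    import java.util.concurrent.ConcurrentHashMap;

    // Holds one cached file: its bytes plus the last-modified timestamp
    // observed when the content was read from disk.
    class CachedFile {
        final byte[] content;
        final long lastModified;

        CachedFile(byte[] content, long lastModified) {
            this.content = content;
            this.lastModified = lastModified;
        }
    }

    // The cache itself: full file path -> cached content, safe for concurrent readers.
    class FileCache {
        final ConcurrentHashMap<String, CachedFile> entries = new ConcurrentHashMap<>();
    }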

Algorithm for Caching Files


The general file caching algorithm should take file changes into account and consists of the following steps (a sketch follows the list):

  • Get the value from the cache using the full path to the file as the key.
  • If the key is not found, read the contents of the file and put it in the cache.
  • If the key is found, check to see if the timestamp of the cached content matches the timestamp of the file.
  • If the timestamps are equal, return the cached content.
  • If the timestamps are not equal, update the cache by reading the file and putting it in the cache.
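Here is a minimal sketch of these steps, reusing the hypothetical CachedFile and FileCache types above; the FileCacheReader name and the use of java.nio.file are my own choices, not from the article:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    class FileCacheReader {

        // Returns the file content, serving it from the cache when the on-disk
        // last-modified timestamp still matches the cached entry.
        static byte[] read(FileCache cache, String fullPath) throws IOException {
            Path path = Paths.get(fullPath);
            long lastModified = Files.getLastModifiedTime(path).toMillis();

            // 1. Get the value from the cache using the full path as the key.
            CachedFile cached = cache.entries.get(fullPath);

            // 3./4. Key found and timestamps are equal: return the cached content.
            if (cached != null && cached.lastModified == lastModified) {
                return cached.content;
            }

            // 2./5. Key missing or timestamps differ: read the file and update the cache.
            byte[] content = Files.readAllBytes(path);
            cache.entries.put(fullPath, new CachedFile(content, lastModified));
            return content;
        }
    }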