I have a problem that requires me to parse a text file on the local machine. There are several complications:
- Files can be quite large (700 MB+).
- The pattern occurs on several lines.
- I need to store the line content that follows the pattern (template).
I created simple code using BufferedReader, String.indexOf, and String.substring (to get element 3).
Inside the file there is a key (the template) named code=, which occurs many times in different blocks. The program reads the file line by line with BufferedReader.readLine, uses indexOf to check whether the pattern appears on the line, and if so extracts the text after the pattern and appends it to a single accumulating string.
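To make the extraction step concrete, here is a minimal sketch of what happens for one matching line (the sample line is made up; the full code is further down):

```java
String codeC = "code=[";
String line = "code=[ABC123] name=foo";          // made-up sample line
if (line.indexOf(codeC) != -1) {
    // keep everything after the marker
    String rest = line.substring(codeC.length()); // "ABC123] name=foo"
    System.out.println(rest);
}
```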
When I ran my program on a 600 MB file, I noticed that performance degraded badly while processing it. I read a post on CodeRanch saying that the Scanner class performs poorly with large files.
Are there any methods or libraries that could improve my performance?
Thanks in advance.
Here is my source code:
```java
String codeC = "code=[";
String source = "";
try {
    FileInputStream f1 = new FileInputStream("c:\\Temp\\fo1.txt");
    DataInputStream in = new DataInputStream(f1);
    BufferedReader br = new BufferedReader(new InputStreamReader(in));
    String strLine;
    boolean bPrnt = false;
    int ln = 0;
    // Read the file line by line
    while ((strLine = br.readLine()) != null) {
        // Print matching lines on the console and keep the text after the marker
        if (strLine.indexOf(codeC) != -1) {
            ln++;
            System.out.println(strLine + " ---- register : " + ln);
            strLine = strLine.substring(codeC.length(), strLine.length());
            source = source + "\n" + strLine;
        }
    }
    System.out.println("");
    System.out.println("Lines :" + ln);
    f1.close();
} catch ( ... ) { ... }
```
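One thing that may matter here: `source = source + "\n" + strLine` copies the whole accumulated string on every match, which gets more and more expensive as the result grows. Below is a sketch of the same scan using a StringBuilder and try-with-resources; the class name is arbitrary, and whether the concatenation is really my bottleneck is only a guess:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CodeScanSketch {
    public static void main(String[] args) {
        String codeC = "code=[";
        StringBuilder source = new StringBuilder();
        int ln = 0;
        // 64 KB read buffer; the path is the one from my original code
        try (BufferedReader br = new BufferedReader(new FileReader("c:\\Temp\\fo1.txt"), 1 << 16)) {
            String strLine;
            while ((strLine = br.readLine()) != null) {
                int pos = strLine.indexOf(codeC);
                if (pos != -1) {
                    ln++;
                    // append only the text after the marker (offset from where it was found)
                    source.append('\n').append(strLine.substring(pos + codeC.length()));
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println("Lines :" + ln);
    }
}
```

The DataInputStream wrapper in my original code is also unnecessary, since BufferedReader only needs a Reader.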