I am updating some old code to fetch binary data from a URL rather than from a database (the data is being moved out of the database and will be accessible over HTTP instead). As far as I can tell, the database API provided the data as a raw byte array, and the old code wrote that array to a file using a BufferedOutputStream.
I am not very familiar with Java, but a bit of searching led me to this code:
    URL u = new URL("my-url-string");
    URLConnection uc = u.openConnection();
    uc.connect();
    InputStream in = uc.getInputStream();
    ByteArrayOutputStream out = new ByteArrayOutputStream();

    final int BUF_SIZE = 1 << 8;  // 256-byte read buffer
    byte[] buffer = new byte[BUF_SIZE];
    int bytesRead = -1;
    while ((bytesRead = in.read(buffer)) != -1) {
        out.write(buffer, 0, bytesRead);
    }
    in.close();
    fileBytes = out.toByteArray();
This works most of the time, but I have a problem when the copied data is large: I get an OutOfMemoryError for data items that the old code handled fine.
I assume this is because this version of the code holds several copies of the data in memory at the same time, while the original code did not. (I gather that ByteArrayOutputStream grows its internal buffer by copying, and that toByteArray() makes yet another full copy.)
Is there an easy way to fetch binary data from a URL and stream it straight to a file, without incurring the cost of multiple copies in memory?
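For what it's worth, this is the kind of direct streaming I was picturing — a rough sketch assuming Java 7+, where the URL string is the same placeholder as above and the output path is hypothetical:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.net.URLConnection;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class UrlToFile {
        public static void main(String[] args) throws IOException {
            URL u = new URL("my-url-string");  // placeholder URL, as above
            URLConnection uc = u.openConnection();
            uc.connect();
            try (InputStream in = uc.getInputStream()) {
                // Copy the response body straight to disk; only a small
                // fixed-size buffer is held in memory at any time, never
                // the whole payload.
                Files.copy(in, Paths.get("output.bin"),  // hypothetical target path
                        StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }

If Files.copy streams through a fixed-size internal buffer, that would seem to avoid the problem entirely, but I would like to confirm this is the right idiom.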