Hi *,
I am running into out-of-memory situations while trying to stream a large number of documents that are to be delivered as a zip file to the frontend.
What I have done so far (pseudocode):
...
response.setBufferSize(128 * 1024);
response.setHeader("Content-Disposition", "attachment; filename=\"" + exportName + "\"");
response.setContentType("application/x-zip-compressed");
final ZipOutputStream zipOut = new ZipOutputStream(response.getOutputStream());
try
{
    for (final Attachment attachment : ListOfAttachments)
    {
        final ZipEntry entry = new ZipEntry(fileName);
        zipOut.putNextEntry(entry);
        final byte[] buffer = new byte[32768];
        int r = attachmentStream.read(buffer);
        while (r > 0)
        {
            zipOut.write(buffer, 0, r);
            r = attachmentStream.read(buffer);
        }
        zipOut.closeEntry();
    }
}
finally
{
    zipOut.close();
}
...
When I step through with the debugger, I see that everything works, except that the download only begins after all attachments have been processed. The zip file is much larger than the response buffer with its 128 KB...
What am I doing wrong? Is there something to consider on the frontend (JavaScript) side?
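For reference, here is a stripped-down, standalone sketch of the per-entry streaming pattern I am aiming for, using plain java.util.zip and writing to an in-memory stream that stands in for the servlet response; the class and method names are illustrative only:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipStreamSketch {

    // Streams each attachment into its own zip entry, closing the entry
    // and flushing after each one so buffered bytes are pushed downstream
    // immediately instead of accumulating until the end.
    public static byte[] zipAttachments(Map<String, byte[]> attachments) throws IOException {
        // Stands in for response.getOutputStream() in the real servlet.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        ZipOutputStream zipOut = new ZipOutputStream(sink);
        try {
            byte[] buffer = new byte[32768];
            for (Map.Entry<String, byte[]> att : attachments.entrySet()) {
                zipOut.putNextEntry(new ZipEntry(att.getKey()));
                InputStream in = new ByteArrayInputStream(att.getValue());
                int r;
                while ((r = in.read(buffer)) != -1) {
                    zipOut.write(buffer, 0, r);
                }
                zipOut.closeEntry(); // finish this entry before the next one
                zipOut.flush();      // push buffered bytes down to the sink
            }
        } finally {
            zipOut.close();
        }
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        Map<String, byte[]> files = new LinkedHashMap<>();
        files.put("a.txt", "hello".getBytes());
        files.put("b.txt", "world".getBytes());
        byte[] zip = zipAttachments(files);
        System.out.println("zip size: " + zip.length + " bytes");
    }
}
```

In the real code the in-memory maps and streams would of course be the response output stream and the attachment input streams; the point of the sketch is only the closeEntry()/flush() cadence per attachment.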
Best regards,
Christian