Efficiently generate many zip files

I have a huge basic set of files (retrieved by parsing other files), from which I want to generate many zip files that each add one additional parameter file:

zip_file_n = basic_set_of_files + parameter_file_n

As I understood from the Compress::Zip::Writer class description, the zip content is only written to the IO object when the Compress::Zip::Writer open block is exited, and is thus not available anymore.

Can one “keep” the zip data for the basic set of files and just add the zip data of the parameter file, in order to generate many zip files efficiently?

If I understand you correctly, you could in theory write the basic files to an IO::Memory, then, when creating the various archives, rewind and copy the data from the memory IO before adding your extra parameter file (see the sketch below). This of course depends on what you mean by “efficiently”: the data stored in this IO is kept in memory, which could become a problem depending on the amount of data.
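A minimal sketch of that idea, where the file lists (`basic_paths`, `parameter_paths`) and the `zip_file_#{n}.zip` naming are placeholders. It caches the raw contents of the basic files in IO::Memory objects once, then rewinds and re-adds them for every archive. Note that each archive still re-compresses the cached data; as far as I know, Compress::Zip::Writer has no way to reuse already-compressed entries.

```crystal
require "compress/zip"

# Hypothetical inputs for illustration:
basic_paths     = ["a.txt", "b.txt"]             # the shared basic set
parameter_paths = ["param_1.txt", "param_2.txt"] # one per archive

# Read each basic file into memory once.
cache = {} of String => IO::Memory
basic_paths.each do |path|
  cache[path] = IO::Memory.new(File.read(path))
end

parameter_paths.each_with_index do |param_path, n|
  Compress::Zip::Writer.open("zip_file_#{n}.zip") do |zip|
    # Rewind and copy each cached basic file into the archive.
    cache.each do |name, io|
      io.rewind
      zip.add(name, io)
    end
    # Then add the per-archive parameter file.
    File.open(param_path) do |param_io|
      zip.add(param_path, param_io)
    end
  end
end
```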

I’d probably just start out not sharing anything and see if that’s already fast enough before trying to optimize it further. A baseline along those lines is sketched below.
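A possible baseline, with the same placeholder names as above, that rebuilds every archive from scratch with no shared state:

```crystal
require "compress/zip"

# Hypothetical inputs; each archive is built from scratch.
basic_paths     = ["a.txt", "b.txt"]
parameter_paths = ["param_1.txt", "param_2.txt"]

parameter_paths.each_with_index do |param_path, n|
  Compress::Zip::Writer.open("zip_file_#{n}.zip") do |zip|
    # Add the basic set plus this archive's parameter file.
    (basic_paths + [param_path]).each do |path|
      File.open(path) do |io|
        zip.add(path, io)
      end
    end
  end
end
```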

An alternative to a memory buffer would be to write the common part to a temp file, then copy it in at the beginning of each derivative file.
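One caveat: the zip format puts the central directory at the end of the archive (Compress::Zip::Writer emits it when the open block exits), so byte-for-byte reuse of a common zip prefix would not on its own yield entries that are registered in the final archive. The sketch below therefore reads the suggestion conservatively and caches the raw contents of the common files in temp files, mirroring the IO::Memory approach on disk; all names are again placeholders.

```crystal
require "compress/zip"

basic_paths     = ["a.txt", "b.txt"]             # hypothetical
parameter_paths = ["param_1.txt", "param_2.txt"] # hypothetical

# Copy each basic file to its own temp file once, so the file
# boundaries are preserved for reuse across archives.
temp_copies = basic_paths.map do |path|
  tmp = File.tempfile("basic")
  File.open(path) { |src| IO.copy(src, tmp) }
  {path, tmp}
end

parameter_paths.each_with_index do |param_path, n|
  Compress::Zip::Writer.open("zip_file_#{n}.zip") do |zip|
    temp_copies.each do |name, tmp|
      tmp.seek(0) # rewind the temp copy before each reuse
      zip.add(name, tmp)
    end
    File.open(param_path) { |io| zip.add(param_path, io) }
  end
end

# Clean up the temp copies once all archives are written.
temp_copies.each { |_, tmp| tmp.delete }
```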