Hello,
I’m writing a small proxy, and I’m not sure whether I risk exhausting RAM when downloading large files, since the browser pulls the file over the Internet more slowly than the server fetches it over the internal network (because of the bandwidth difference).
Does IO.copy handle back pressure?
I have the impression it does: I ran some tests with simultaneous downloads and memory usage seemed fine. The data also goes through an IO::Memory and a small buffer (4 KB), but I may be wrong.
It’s just for one endpoint; if IO.copy handles this well, it saves me from putting an Nginx proxy in front of the app just for this case.
```crystal
HTTP::Client.get(url) do |res|
  if res.status_code != 200
    Log.debug { "#{res.status_message}: #{url}" }
    return
  end

  ctx.response.headers.merge!(res.headers)
  IO.copy(res.body_io, ctx.response.output) if res.body_io?
end
```
In this case, does IO.copy handle the back pressure?
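For reference, my mental model of IO.copy is a fixed-buffer loop like the sketch below (this is my own simplified illustration, not the actual stdlib source; `copy_sketch` is a hypothetical name). The point is that the write call blocks until the destination accepts the bytes, so only one buffer's worth of data is ever held in memory at a time:

```crystal
# Simplified sketch (NOT Crystal's actual IO.copy implementation):
# read into a small fixed buffer, then write it out before reading more.
def copy_sketch(src : IO, dst : IO) : Int64
  buffer = uninitialized UInt8[4096]
  count = 0_i64
  while (len = src.read(buffer.to_slice)) > 0
    # This write suspends the fiber until the destination (e.g. the
    # client socket) accepts the bytes, so a slow client naturally
    # throttles how fast we read from the upstream body.
    dst.write(buffer.to_slice[0, len])
    count += len
  end
  count
end
```

If this model is right, memory stays bounded by the buffer size regardless of the file size or the speed gap between the two sides.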