Ulimit means user limit. So if you have multiple processes running under that same user, they all count against the same limit on open files. `ulimit -n` = 256 is a very low setting, especially if you intend to do mass file operations. I'm not sure what you're working on (maybe it's a shared system?), but if you can, you should consider increasing it.
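For reference, you can inspect and raise the limit from your shell. This is a generic sketch (the exact ceiling you can raise it to depends on your system's hard limit, and a shared system may need an admin to change it):

```shell
# Current soft limit on open file descriptors for this shell session
ulimit -n

# Hard limit: the ceiling a non-root user may raise the soft limit to
ulimit -Hn

# Raise the soft limit for this session, e.g.:
# ulimit -n 4096
```

Note that `ulimit -n 4096` only affects the current shell and its children; for a permanent change you'd typically edit your system's limits configuration.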
So increasing the ulimit would probably remove the immediate problem, but the error is actually a sign that there may be issues with your code.
It seems you’re essentially starting a job for every file in the folder concurrently. That works for small amounts, but is not efficient at scale. As @Blacksmoke16 already suggested, you should limit the number of concurrently running jobs to keep the number of simultaneously opened files low. Besides, if the number of concurrent uploads exceeds your upload capacity, they all end up fighting for the available bandwidth anyway. So it’s better to limit the number of concurrent uploads in the first place.
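I don't know which language or tool you're using, but the pattern looks like this (sketched in Python; `upload` is a placeholder for your real transfer logic). Instead of one job per file, a fixed-size worker pool processes the files, so at most `MAX_CONCURRENT` files are open at any one time:

```python
import concurrent.futures
import pathlib

MAX_CONCURRENT = 8  # tune to your bandwidth and fd limit

def upload(path: pathlib.Path) -> str:
    # Hypothetical upload: replace with your real transfer code.
    # The file is opened and closed inside the worker, so the number
    # of simultaneously open files never exceeds the pool size.
    with path.open("rb") as f:
        data = f.read()
    return f"uploaded {path.name} ({len(data)} bytes)"

def upload_all(folder: str) -> list[str]:
    files = [p for p in pathlib.Path(folder).iterdir() if p.is_file()]
    # The pool caps concurrency instead of spawning one job per file.
    with concurrent.futures.ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
        return list(pool.map(upload, files))
```

The same idea applies in most languages: a bounded worker pool, a semaphore, or a channel of fixed capacity all achieve the same cap on in-flight jobs.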