One needs to send 1 million HTTP requests concurrently, in batches, and read the responses, with no more than 100 requests in flight at a time.
Which approach is better, recommended, idiomatic?
-
Send 100, wait for all of them to finish, send another 100, wait for those to finish… and so on.
-
Send 100. As a request among the 100 finishes, add a new one into the pool: “done, add a new one; done, add a new one”. As a stream.
That’s not 1M concurrent requests.
That’s 100 concurrent requests for a queue of 1M tasks.
A work queue feeding a thread pool is the normal way, but it’s possible to get fancy with optimizations.
Basically you fire 100 requests and when one completes you immediately fire another.
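A minimal sketch of that pattern in Go, assuming the URLs can be generated or streamed in (the URL format and the `fetch` helper here are illustrative, not from the original question): a fixed pool of 100 workers pulls from a shared channel, so a new request starts the moment a previous one completes, with no batch boundaries.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"sync"
)

const maxInFlight = 100 // never more than 100 requests in flight

func main() {
	// jobs is the work queue; the 1M URLs are pushed through it.
	jobs := make(chan string)

	var wg sync.WaitGroup
	for i := 0; i < maxInFlight; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			// Each worker loops: finish a request, pull the next one.
			for url := range jobs {
				fetch(url)
			}
		}()
	}

	// Feed the queue; a worker picks up a new URL as soon as it is free
	// ("done - add a new one"). The URL below is a placeholder.
	for i := 0; i < 1_000_000; i++ {
		jobs <- fmt.Sprintf("https://example.com/item/%d", i)
	}
	close(jobs)
	wg.Wait()
}

func fetch(url string) {
	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	defer resp.Body.Close()
	// Read and discard the body so the keep-alive connection can be reused.
	io.Copy(io.Discard, resp.Body)
}
```

The fixed-batch version (option one) is simpler to write but leaves workers idle while the slowest request in each batch finishes; the queue version above keeps all 100 slots busy the whole time.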