import Foundation
import os.log

func asynchronousExperiment() {
    // An asynchronous operation that simulates five seconds of work.
    func asynchronousOperation(index: Int, completion: @escaping () -> Void) {
        os_log("STARTING %d", index)
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            os_log("FINISHED %d", index)
            completion()
        }
    }

    // If using semaphores, you want to make sure you `wait` on a different thread,
    // generally a background thread. I'll use a global worker thread here.
    DispatchQueue.global().async {
        // A counting semaphore with an initial value of 4 allows at most four
        // operations in flight at any given time.
        let semaphore = DispatchSemaphore(value: 4)

        for i in 0 ..< 100 {
            semaphore.wait()                    // blocks once four operations are running
            asynchronousOperation(index: i) {
                semaphore.signal()              // frees a slot when an operation finishes
            }
        }
    }
}
@robertmryan commented Jul 29, 2019
This is designed for controlling the degree of concurrency. In this example, while there are 100 asynchronous tasks, no more than four will run at any given time. This is very useful (a) to avoid exhausting the limited pool of GCD worker threads; and (b) to constrain peak memory usage, especially if asynchronousOperation does something memory intensive (e.g., downloading and manipulating large images).

Frankly, we’d often reach for OperationQueue and its maxConcurrentOperationCount to control the degree of concurrency, but the above is a GCD solution to the same problem.
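
For comparison, here is a minimal sketch of that OperationQueue approach, assuming the work inside each operation is synchronous; for genuinely asynchronous work (like the asyncAfter call above) you would need an asynchronous Operation subclass instead. The operationQueueExperiment name and the five-second Thread.sleep stand-in are just illustrative:

import Foundation
import os.log

func operationQueueExperiment() {
    let queue = OperationQueue()
    queue.maxConcurrentOperationCount = 4       // never more than four operations run at once

    for i in 0 ..< 100 {
        queue.addOperation {
            os_log("STARTING %d", i)
            Thread.sleep(forTimeInterval: 5)    // stand-in for synchronous, memory-intensive work
            os_log("FINISHED %d", i)
        }
    }
}

With this approach the queue itself throttles the work, so there is no need for semaphore bookkeeping or for dispatching the waits to a background thread.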
