Ultimate Grand Central Dispatch tutorial in Swift


Learn the principles of multi-threading with the GCD framework in Swift. Queues, tasks, groups: everything you'll ever need, I promise.

iOS

GCD concurrency tutorial for beginners

The Grand Central Dispatch (GCD, or simply Dispatch) framework is based on the underlying thread pool design pattern. This means that there is a fixed number of threads spawned by the system – based on factors like the number of CPU cores – and they are always available, waiting for tasks to be executed concurrently. 🚦

Creating threads on the fly is an expensive task, so GCD organizes tasks into specific queues, and later on the tasks waiting on these queues are executed on a proper and available thread from the pool. This approach leads to great performance and low execution latency. We can say that the Dispatch framework is a really fast and efficient concurrency framework designed for modern multi-core hardware and needs.

Concurrency, multi-tasking, CPU cores, parallelism and threads

A processor can run tasks that you create programmatically; this is usually called coding, developing or programming. The code executed by a CPU core is a thread. So your app is going to create a process that is made up of threads. 🤓

In the past a processor had one single core, so it could only deal with one task at a time. Later on time-slicing was introduced, so CPUs could execute threads concurrently using context switching. As time passed, processors gained more horsepower and cores, so they became capable of real multi-tasking using parallelism. ⏱

Nowadays a CPU is a very powerful unit, capable of executing billions of tasks (cycles) per second. Because of this high speed, Intel introduced a technology called hyper-threading. They divided CPU clock cycles between (usually two) processes running at the same time, so the number of available threads essentially doubled. 📈

As you can see, concurrent execution can be achieved in various ways, but you don't need to worry about that too much. It's up to the CPU architecture how it solves concurrency, and it's the operating system's job to decide how many threads are going to be spawned for the underlying thread pool. The GCD framework hides all of this complexity, but it's always good to understand the basic principles. 👍


Synchronous and asynchronous execution

Every work item can be executed either synchronously or asynchronously.

Have you ever heard of blocking and non-blocking code? It's the same situation here. With synchronous tasks you'll block the execution queue, but with async tasks your call will instantly return and the queue can continue executing the remaining tasks (or work items, as Apple calls them). 🚧

Synchronous execution

When a work item is executed synchronously with the sync method, the program waits until execution finishes before the method call returns.

Your function is most likely synchronous if it has a return value, so func load() -> String is probably going to block the thread it runs on until the resource is completely loaded and returned.

Asynchronous execution

When a work item is executed asynchronously with the async method, the method call returns immediately.

Completion blocks are a good sign of async methods; for example, if you look at this method, func load(completion: (String) -> Void), you can see that it has no return type, but the result of the function is passed back to the caller later on through a block.

This is a typical use case: if you have to wait for something inside your method, like reading the contents of a huge file from disk, you don't want to block your CPU just because of the slow IO operation. There can be other tasks that are not IO heavy at all (math operations, etc.) that can be executed while the system is reading your file from the physical hard drive. 💾
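To make the difference concrete, here is a minimal sketch of the two shapes mentioned above (the file contents and the sleep-based delays are just placeholders for real IO, and the completion variant needs @escaping because it hops onto a background queue):

import Foundation

// synchronous: blocks the caller until the value is ready
func load() -> String {
    sleep(1) // placeholder for slow IO
    return "file contents"
}

// asynchronous: returns immediately, the result arrives later through the block
func load(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async {
        sleep(1) // placeholder for slow IO
        completion("file contents")
    }
}

let result = load()       // the caller waits here for the return value
print(result)

load { result in          // this call returns right away...
    print(result)         // ...and the result is delivered here later
}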

With dispatch queues you can execute your code synchronously or asynchronously. With synchronous execution the queue waits for the work; with async execution the code returns immediately without waiting for the task to complete. ⚡️


Dispatch queues

As I mentioned before, GCD organizes tasks into queues; these are just like the queues at the shopping mall. On every dispatch queue, tasks will be executed in the same order as you add them to the queue – FIFO: the first task in the line will be executed first – but you should note that the order of completion is not guaranteed. Tasks will complete according to their code complexity. So if you add two tasks to the queue, a slow one first and a fast one later, the fast one can finish before the slower one. ⌛️

Serial and concurrent queues

There are two types of dispatch queues. Serial queues can execute one task at a time; these queues can be used to synchronize access to a specific resource. Concurrent queues, on the other hand, can execute multiple tasks in parallel at the same time. A serial queue is just like one line in the mall with one cashier; a concurrent queue is like one single line that splits towards two or more cashiers. 💰
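Here is a quick sketch that shows the difference (the queue labels are made up, and the exact output order of the concurrent tasks will vary between runs):

import Foundation

let serialQueue = DispatchQueue(label: "com.theswiftdev.queues.demo-serial")
let concurrentQueue = DispatchQueue(label: "com.theswiftdev.queues.demo-concurrent", attributes: .concurrent)

for i in 1...3 {
    serialQueue.async {
        sleep(1)
        print("serial task \(i) finished")      // always 1, 2, 3
    }
}

for i in 1...3 {
    concurrentQueue.async {
        sleep(1)
        print("concurrent task \(i) finished")  // starts in FIFO order, finishes in any order
    }
}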

Main, global and custom queues

The main queue is a serial one; every task on the main queue runs on the main thread.

Global queues are system provided concurrent queues shared through the operating system. There are exactly four of them, organized by high, default and low priority, plus an IO throttled background queue.

Custom queues can be created by the user. Custom concurrent queues are always mapped into one of the global queues by specifying a Quality of Service (QoS) property. In most cases, if you want to run tasks in parallel it is recommended to use one of the global concurrent queues; you should only create custom serial queues (see the sketch after the lists below).

System provided queues

  • Serial main queue
  • Concurrent global queues
  • high priority global queue
  • default priority global queue
  • low priority global queue
  • global background queue (IO throttled)

Custom queues by quality of service

  • userInteractive (UI updates) -> serial main queue
  • userInitiated (async UI related tasks) -> high priority global queue
  • default -> default priority global queue
  • utility -> low priority global queue
  • background -> global background queue
  • unspecified (lowest) -> low priority global queue
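For example, this is how a custom queue can declare its QoS class when it is created; a sketch with made-up labels:

// a custom serial queue that maps to the utility (low priority) global queue
let customSerialQueue = DispatchQueue(label: "com.theswiftdev.queues.custom-serial", qos: .utility)

// a custom concurrent queue that maps to the global background queue
let customConcurrentQueue = DispatchQueue(label: "com.theswiftdev.queues.custom-concurrent",
                                          qos: .background,
                                          attributes: .concurrent)

customConcurrentQueue.async {
    print("running with background QoS")
}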

Enough of the theory; let's see how to use the Dispatch framework in action! 🎬


How to use the DispatchQueue class in Swift?

Here is how you can get all the queues from above using the brand new GCD syntax available from Swift 3. Please note that you should always use a global concurrent queue instead of creating your own one, except if you are going to use the concurrent queue for locking with barriers to achieve thread safety; more on that later. 😳

How to get a queue?

import Dispatch

DispatchQueue.main
DispatchQueue.global(qos: .userInitiated)
DispatchQueue.global(qos: .userInteractive)
DispatchQueue.global(qos: .background)
DispatchQueue.global(qos: .default)
DispatchQueue.global(qos: .utility)
DispatchQueue.global(qos: .unspecified)
DispatchQueue(label: "com.theswiftdev.queues.serial")
DispatchQueue(label: "com.theswiftdev.queues.concurrent", attributes: .concurrent)

So executing a task on a background queue and updating the UI on the main queue after the task finished is a pretty easy job using dispatch queues.

DispatchQueue.global(qos: .background).async {
    // do your long running background task here

    DispatchQueue.main.async {
        // update the UI on the main queue when the task is finished
    }
}

Sync and async calls on queues

There is no big difference between sync and async methods on a queue. Sync is just an async call with a semaphore (explained later) that waits for the return value. A sync call will block, while an async call will return instantly. 🎉

let q = DispatchQueue.global()

let text = q.sync {
    return "this will block"
}
print(text)

q.async {
    print("this will return instantly")
}

Basically, if you need a return value use sync, but in every other case just go with async. DEADLOCK WARNING: you should never call sync on the main queue, because it will cause a deadlock and a crash. You can use a helper like the one sketched below if you are looking for a safe way to do sync calls on the main queue / thread. 👌
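A minimal sketch of such a helper (the syncOnMain name is made up; it simply checks whether the caller is already on the main thread before calling sync):

import Foundation

// runs the block synchronously on the main thread without the deadlock that a
// plain DispatchQueue.main.sync call would cause when we are already on it
func syncOnMain<T>(_ block: () -> T) -> T {
    if Thread.isMainThread {
        return block()
    }
    return DispatchQueue.main.sync(execute: block)
}

Called from a background queue it blocks until the main thread has run the block; called from the main thread it just runs the block in place.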

Don't call sync on a serial queue from the serial queue's thread!

Delay execution

You can simply delay code execution using the Dispatch framework.

DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(2)) {
    // this block runs on the main queue after a 2 second delay
}

Perform concurrent loop

Dispatch queues simply allow you to perform iterations concurrently.

DispatchQueue.concurrentPerform(iterations: 5) { (i) in
    print(i)
}

Debugging

Oh, by the way, this is just for debugging purposes, but you can return the name of the current queue by using this little extension. Do not use it in production code!!!

extension DispatchQueue {
    static var currentLabel: String {
        return String(validatingUTF8: __dispatch_queue_get_label(nil))!
    }
}
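For example (a quick sketch; the exact label strings may differ between OS versions):

print(DispatchQueue.currentLabel)         // e.g. "com.apple.main-thread"

DispatchQueue.global(qos: .utility).async {
    print(DispatchQueue.currentLabel)     // e.g. "com.apple.root.utility-qos"
}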

Using DispatchWorkItem in Swift

DispatchWorkItem encapsulates work that can be performed. A work item can be dispatched onto a DispatchQueue and within a DispatchGroup. A DispatchWorkItem can also be set as a DispatchSource event, registration, or cancellation handler.

So, just like with operations, by using a work item you can cancel a running task. Also, work items can notify a queue when their task is completed.

var workItem: DispatchWorkItem?
workItem = DispatchWorkItem {
    for i in 1..<6 {
        guard let item = workItem, !item.isCancelled else {
            print("cancelled")
            break
        }
        sleep(1)
        print(String(i))
    }
}

workItem?.notify(queue: .main) {
    print("done")
}

DispatchQueue.global().asyncAfter(deadline: .now() + .seconds(2)) {
    workItem?.cancel()
}
DispatchQueue.main.async(execute: workItem!)

Concurrent tasks with DispatchGroups

So you need to perform multiple network calls in order to construct the data required by a view controller? This is where DispatchGroup can help you. All of your long running background tasks can be executed concurrently, and when everything is ready you'll receive a notification. Just be careful: you have to use thread-safe data structures, so always modify arrays, for example, on the same thread! 😅

func load(delay: UInt32, completion: () -> Void) {
    sleep(delay)
    completion()
}

let group = DispatchGroup()

group.enter()
load(delay: 1) {
    print("1")
    group.leave()
}

group.enter()
load(delay: 2) {
    print("2")
    group.leave()
}

group.enter()
load(delay: 3) {
    print("3")
    group.leave()
}

group.notify(queue: .main) {
    print("done")
}

Note that you always have to balance out the enter and leave calls on the group. The dispatch group also allows us to track the completion of different work items, even if they run on different queues.

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.theswiftdev.queues.serial")
let workItem = DispatchWorkItem {
    print("start")
    sleep(1)
    print("end")
}

queue.async(group: group) {
    print("group start")
    sleep(2)
    print("group end")
}
DispatchQueue.global().async(group: group, execute: workItem)

group.notify(queue: .main) {
    print("done")
}

One more thing you can use dispatch groups for: imagine that you're displaying a nicely animated loading indicator while you do some actual work. It might happen that the work is finished sooner than you'd expect and the indicator animation can't finish. To solve this situation you can add a small delay task so the group will wait until both of the tasks finish. 😎
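Here is a sketch of that idea; showLoader and hideLoader are hypothetical stand-ins for your real indicator code:

import Foundation

func showLoader() { print("loading...") }  // hypothetical indicator helpers
func hideLoader() { print("finished") }

let loaderGroup = DispatchGroup()

showLoader()

// the actual work
loaderGroup.enter()
DispatchQueue.global().async {
    sleep(1) // placeholder for the real task
    loaderGroup.leave()
}

// a minimum display time for the indicator animation
loaderGroup.enter()
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(2)) {
    loaderGroup.leave()
}

loaderGroup.notify(queue: .main) {
    hideLoader()
}

You can also block the current thread until every task in a group has finished by calling wait() on the group: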

let queue = DispatchQueue.global()
let group = DispatchGroup()
let n = 9
for i in 0..<n {
    queue.async(group: group) {
        print("\(i): Running async task...")
        sleep(3)
        print("\(i): Async task completed")
    }
}
group.wait()
print("done")

Semaphores

A semaphore is simply a variable used to handle resource sharing in a concurrent system. It's a really powerful object; here are a few important examples in Swift.

How to make an async task synchronous?

The answer is simple: you can use a semaphore (bonus points for timeouts)!

enum DispatchError: Error {
    case timeout
}

func asyncMethod(completion: (String) -> Void) {
    sleep(2)
    completion("done")
}

func syncMethod() throws -> String {

    let semaphore = DispatchSemaphore(value: 0)
    let queue = DispatchQueue.global()

    var response: String?
    queue.async {
        asyncMethod { r in
            response = r
            semaphore.signal()
        }
    }
    _ = semaphore.wait(timeout: .now() + 5)
    guard let result = response else {
        throw DispatchError.timeout
    }
    return result
}

let response = try? syncMethod()
print(response ?? "no response")

Lock / single access to a resource

If you want to avoid race conditions you are probably going to use mutual exclusion. This can be achieved using a semaphore object, but if your object needs heavy reading capability you should consider a dispatch barrier based solution. 😜

class LockedNumbers {

    let semaphore = DispatchSemaphore(value: 1)
    var elements: [Int] = []

    func append(_ num: Int) {
        _ = self.semaphore.wait(timeout: DispatchTime.distantFuture)
        print("appended: \(num)")
        self.elements.append(num)
        self.semaphore.signal()
    }

    func removeLast() {
        _ = self.semaphore.wait(timeout: DispatchTime.distantFuture)
        defer {
            self.semaphore.signal()
        }
        guard !self.elements.isEmpty else {
            return
        }
        let num = self.elements.removeLast()
        print("removed: \(num)")
    }
}

let items = LockedNumbers()
items.append(1)
items.append(2)
items.append(5)
items.append(3)
items.removeLast()
items.removeLast()
items.append(3)
print(items.elements)

Wait for multiple tasks to complete

Just like with dispatch groups, you can also use a semaphore object to get notified when multiple tasks are finished. You just have to wait for it…

let semaphore = DispatchSemaphore(value: 0)
let queue = DispatchQueue.global()
let n = 9
for i in 0..<n {
    queue.async {
        print("run \(i)")
        sleep(3)
        semaphore.signal()
    }
}
print("wait")
for i in 0..<n {
    semaphore.wait()
    print("completed \(i)")
}
print("done")

Batch execution using a semaphore

You can create thread pool like behavior to simulate limited resources using a dispatch semaphore. So for example if you want to download lots of images from a server you can run a batch of x at a time. Quite handy. 🖐

print("begin")
let sem = DispatchSemaphore(worth: 5)
for i in 0..<10 {
    DispatchQueue.world().async {
        sem.wait()
        sleep(2)
        print(i)
        sem.sign()
    }
}
print("finish")

The DispatchSource object

A dispatch source is a fundamental data type that coordinates the processing of specific low-level system events.

Signals, descriptors, processes, ports, timers and many more. Everything is handled through the dispatch source object. I really don't want to get into the details, it's quite low-level stuff. You can monitor files, ports, signals with dispatch sources. Please just read the official Apple docs. 📄

I'd like to show just one example here, using a dispatch source timer.

let timer = DispatchSource.makeTimerSource()
timer.schedule(deadline: .now(), repeating: .seconds(1))
timer.setEventHandler {
    print("good day")
}
timer.resume()

Thread-safety using the Dispatch framework

Thread safety is an inevitable topic when it comes to multi-threaded code. In the beginning I mentioned that there is a thread pool under the hood of GCD. Every thread has a run loop object associated with it; you can even run them by hand. If you create a thread manually, a run loop will be added to that thread automatically as soon as you ask for it.

let t = Thread {
    print(Thread.current.name ?? "")
    let timer = Timer(timeInterval: 1, repeats: true) { _ in
        print("tick")
    }
    RunLoop.current.add(timer, forMode: .default)
    RunLoop.current.run()
}
t.name = "my-thread"
t.start()

You shouldn't do this; it's for demo purposes only, always use GCD queues!

Queue != Thread

A GCD queue is not a thread; if you run multiple async operations on a concurrent queue your code can run on any available thread that fits the needs.
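You can see this for yourself: the sketch below dispatches a few blocks onto the same global queue and prints the thread that picks each one up (the output differs between runs):

import Foundation

for i in 0..<5 {
    DispatchQueue.global().async {
        // same queue, but potentially a different pool thread each time
        print("task \(i) on \(Thread.current)")
    }
}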

Thread safety is all about avoiding messed up variable states

Imagine a mutable array in Swift. It can be modified from any thread. That's not good, because eventually the values inside it are going to be messed up like hell if the array is not thread safe. For example, multiple threads are trying to insert values into the array. What happens? If they run in parallel, which element is going to be added first? This is why you sometimes need to create thread safe resources.

Serial queues

You can use a serial queue to enforce mutual exclusivity. All the tasks on the queue will run serially (in FIFO order); only one task runs at a time and tasks have to wait for each other. One big downside of this solution is speed. 🐌

let q = DispatchQueue(label: "com.theswiftdev.queues.serial")

q.async {
    // submit a task; it waits for everything queued before it
}

q.sync {
    // blocks the caller until the task finishes (handy when you need a return value)
}

Concurrent queues using barriers

You can send a barrier task to a queue if you provide an extra flag to the async method. If a task like this arrives at the queue, it ensures that nothing else will be executed until the barrier task has finished. To sum this up, barrier tasks are sync points for concurrent queues. Use async barriers for writes, sync blocks for reads. 😎

let q = DispatchQueue(label: "com.theswiftdev.queues.concurrent", attributes: .concurrent)

q.async(flags: .barrier) {
    // write data here; nothing else on the queue runs until this barrier finishes
}

q.sync {
    // read data here; reads can run concurrently with each other
}

This method will result in extremely fast reads in a thread safe environment. You can also use serial queues, semaphores or locks; it all depends on your current situation, but it's good to know all the available options, isn't it? 🤐
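Putting the pattern together, here is a minimal sketch of a thread safe array wrapper built on a concurrent queue with barriers (the SafeArray name is made up):

final class SafeArray<Element> {

    private var storage: [Element] = []
    private let queue = DispatchQueue(label: "com.theswiftdev.queues.safe-array",
                                      attributes: .concurrent)

    // reads are sync and can run concurrently with other reads
    var elements: [Element] {
        return queue.sync { storage }
    }

    // writes are async barriers: they wait for pending reads and block new ones
    func append(_ element: Element) {
        queue.async(flags: .barrier) {
            self.storage.append(element)
        }
    }
}

let numbers = SafeArray<Int>()
DispatchQueue.concurrentPerform(iterations: 100) { i in
    numbers.append(i)
}
print(numbers.elements.count) // 100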


A few anti-patterns

You have to be very careful with deadlocks, race conditions and the readers-writers problem. Usually calling the sync method on a serial queue will cause you most of the trouble. Another issue is thread safety, but we've already covered that part. 😉

let queue = DispatchQueue(label: "com.theswiftdev.queues.serial")

queue.sync {
    // the outer sync blocks the serial queue...
    queue.sync {
        // ...so this inner sync can never start -> deadlock
    }
}

DispatchQueue.global(qos: .utility).sync {
    // if this runs on the main thread, the main thread is now blocked...
    DispatchQueue.main.sync {
        // ...so this sync on the main queue can never run -> deadlock
    }
}

The Dispatch framework (aka. GCD) is an amazing one; it has such potential and it really takes some time to master it. The real question is: what path is Apple going to take in order to embrace concurrent programming on a whole new level? Promises or await, maybe something completely new; let's hope that we'll see something in Swift 6.