GCD Part 1: Queues and methods

Alex Shchukin
May 26, 2021 · 5 min read

I would like to start a series of articles about Grand Central Dispatch (GCD). GCD, or libdispatch, is one of the most popular tools for multithreaded programming on iOS and macOS. It's a library written in C that eases thread management: instead of creating threads manually and then controlling them ourselves, we can use abstract queues and put all the responsibility for thread management on them.

In this series, we will cover basic primitives like queues and how to work with them, explore dispatch sources, and touch on DispatchIO (which is not a very popular tool). We will implement some basic approaches that can be used in real-world applications. And for the most curious, we will try to implement GCD primitives ourselves.

Dispatch queues

In this first article, I'll explain dispatch queues and how to work with them. Basically, a dispatch queue follows the same principles as a FIFO queue, one of the classic data structures.

Here is how we can create a serial queue. As shown in the code below, a queue is serial by default when no attributes are specified.

let serialQueue = DispatchQueue(label: "com.test.serial")

In contrast, a concurrent queue can execute its tasks in parallel. You create a concurrent queue by setting the attributes parameter to .concurrent.

let concurrentQueue = DispatchQueue(label: "com.test.concurrent", attributes: .concurrent)

It's important to understand the relation between queues and threads. First of all, a queue is an abstraction on top of threads: there is a thread pool shared by the queues, and each queue performs its tasks on threads taken from that pool. A serial queue uses only one arbitrary thread at a time, while a concurrent queue can use multiple threads for its tasks.

Let's consider a situation where we split our work into many pieces and run them on a concurrent queue. The queue will execute the tasks on different threads, but a CPU core can only run one thread at a time, so GCD may end up creating far more threads than there are cores without gaining any real parallelism. This situation is called thread explosion. It is very heavy performance-wise and, in the worst case, can even lead to a deadlock. That means we should be careful with concurrent queues and not overload them with a large number of tasks. Another good practice is to limit the number of serial queues and use a target queue hierarchy per subsystem. We will take a closer look at the target queue hierarchy in a following article.
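
As a rough illustration of the scenario above, the sketch below submits a large number of blocking tasks to a concurrent queue (the queue label, the loop count, and the sleep call are placeholders chosen just for the demonstration). Because every task blocks its thread, GCD may spin up a new thread for almost each of them, far more than the CPU has cores:

import Foundation

let heavyQueue = DispatchQueue(label: "com.test.heavy", attributes: .concurrent)

for i in 0..<100 {
    heavyQueue.async {
        sleep(2)                                 // simulates a blocking call
        print("task \(i) on \(Thread.current)")  // often a different thread per task
    }
}

Running a sketch like this, the printed thread descriptions typically show dozens of distinct threads, which is exactly the overhead we want to avoid.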

The label parameter used in both cases is a unique string identifier that helps you find the queue in various debugging tools. Since GCD queues are used across different frameworks, it is recommended to use a reverse-DNS style identifier.

It is also possible to fetch a queue from a pool of global queues. These queues are created by the OS and are also used for system tasks. For heavy work, it is better to create your own queues instead of relying on the global ones.

let globalQueue = DispatchQueue.global()

All global queues are concurrent, but there is one exception to that rule: the main queue. This queue is serial, and all the tasks queued on it are executed on the main thread.

let mainQueue = DispatchQueue.main
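
A very common pattern that combines the global and main queues is to do the work in the background and hop back to the main queue for anything that must happen on the main thread. Here is a minimal sketch; the computation inside the closure is just a placeholder:

DispatchQueue.global().async {
    // placeholder for some background work
    let sum = (1...1_000_000).reduce(0, +)

    DispatchQueue.main.async {
        // back on the main thread, e.g. to update the UI
        print("sum: \(sum)")
    }
}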

Async vs Sync

Let's discuss how to use queues. async and sync are the two basic methods we can use to submit tasks to queues. sync blocks the calling thread until the task finishes, while async returns control to the caller right after the task is enqueued. In the example below, you can see how async and sync behave on different types of queues.

The serial queue:

serialQueue.async {
    print("test1")
}
serialQueue.async {
    sleep(1)
    print("test2")
}
serialQueue.sync {
    print("test3")
}
serialQueue.sync {
    print("test4")
}

Result:

test1
test2
test3
test4
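
Before moving on, here is a small sketch that makes the difference between the two calls more visible. The prints made from the calling thread show that async returns immediately, while sync blocks until its task is done (the messages themselves are arbitrary):

serialQueue.async {
    sleep(1)
    print("async task done")          // runs later on a thread from the pool
}
print("returned right after async")   // printed immediately, the caller does not wait

serialQueue.sync {
    print("sync task done")           // the caller is blocked until this line runs
}
print("returned after sync")          // printed only after the sync task has finished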

Let's consider what happens if you call sync inside another sync call on the same serial queue. The outer task is added to the queue, and the calling thread waits until it finishes. Inside that task, another sync call submits a second task, but it cannot start until the serial queue finishes the current one. We end up in a situation where the tasks block each other. This situation is called a deadlock, and we will look at it more closely in the following articles.

// Causes a deadlock
serialQueue.sync {
    serialQueue.sync {
        print("test")
    }
}

Since the main queue is a serial queue, we come to another rule: you should not call sync on the main queue from the main thread. The idea is much the same as in the previous paragraph. The task submitted with sync waits for the main queue, but the main queue cannot finish its current task because the sync call itself is blocking the main thread.

// Causes a deadlock
DispatchQueue.main.sync {
    print("test")
}

In the concurrent queue example, the only guarantee is that test3 will be printed before test4:

concurrentQueue.async {
    print("test1")
}
concurrentQueue.async {
    print("test2")
}
concurrentQueue.sync {
    print("test3")
}
concurrentQueue.sync {
    print("test4")
}

Result:

test2
test1
test3
test4

or

test1
test3
test2
test4

As you can see, the order of printing is arbitrary, except that test3 is always printed before test4: the sync call for test3 blocks the caller until it finishes, so test4 cannot even be submitted earlier.

Method asyncAfter

If we want to delay the execution of a task, we can use another method called asyncAfter. This method returns control to the calling thread immediately and executes the task no earlier than the specified deadline. In the example below, the task is executed 3 seconds after it is submitted to the queue.

concurrentQueue.asyncAfter(deadline: .now() + 3, execute: {
    print("test")
})

Result:

<- 3 seconds wait time ->
test

Let's consider another situation where we execute a long-running task on a serial queue and schedule a second task with asyncAfter. When the deadline arrives, the long-running task is not finished yet, so the delayed task has to wait until the queue becomes free and is executed only after the long-running task completes.

serialQueue.async {
    sleep(3)
    print("finish")
}
serialQueue.asyncAfter(deadline: .now() + 1, execute: {
    print("test")
})

Result:

<- 3 seconds wait time ->
finish
test

We have learned what the basic primitives in GCD, the queues, are and how to use them. It is very important to understand these concepts because all of GCD's functionality is built on them. This was the first part of the series; in the next article, we will look at QoS (quality of service). I'll explain how it works, and we will run some examples.
