Although Grand Central Dispatch (or GCD for short) has been around for a while, not everyone knows how to get the most out of it. This is understandable; concurrency is tricky, and GCD’s C-based API can seem like a set of pointy corners poking into the smooth world of Objective-C. Learn about Grand Central Dispatch in-depth in this two-part tutorial series.
In this two-part series, the first tutorial explains what GCD does and showcases several of the more basic GCD functions. In the second part, you’ll learn several of the more advanced functions GCD has to offer.
What is GCD?
GCD is the marketing name for libdispatch, Apple’s library that provides support for concurrent code execution on multicore hardware on iOS and OS X. It offers the following benefits:
- GCD can improve your app’s responsiveness by helping you defer computationally expensive tasks and run them in the background.
- GCD provides an easier concurrency model than locks and threads and helps to avoid concurrency bugs.
- GCD can potentially optimize your code with higher performance primitives for common patterns such as singletons.
This tutorial assumes that you have a basic understanding of working with blocks and GCD. If you’re brand-new to GCD, check out Multithreading and Grand Central Dispatch on iOS for Beginners to learn the essentials.
GCD Terminology
To understand GCD, you need to be comfortable with several concepts related to threading and concurrency. These can be both vague and subtle, so take a moment to review them briefly in the context of GCD.
Serial vs. Concurrent
These terms describe when tasks are executed with respect to each other. Tasks executed serially are always executed one at a time. Tasks executed concurrently might be executed at the same time.
Although these terms have wide application, for the purposes of this tutorial you can consider a task to be an Objective-C block. Don’t know what a block is? Check out the How to Use Blocks in iOS 5 Tutorial. In fact, you can also use GCD with function pointers, but in most cases this is substantially more tricky to use. Blocks are just easier!
Synchronous vs. Asynchronous
Within GCD, these terms describe when a function completes with respect to another task that the function asks GCD to perform. A synchronous function returns only after the completion of a task that it orders.
An asynchronous function, on the other hand, returns immediately, ordering the task to be done but not waiting for it. Thus, an asynchronous function does not block the current thread of execution from proceeding on to the next function.
Be careful — when you read that a synchronous function “blocks” the current thread, or that the function is a “blocking” function or blocking operation, don’t get confused! The verb blocks describes how a function affects its own thread and has no connection to the noun block, which describes an anonymous function literal in Objective-C and defines a task submitted to GCD.
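To see the distinction in code, here’s a minimal sketch (the log messages are just for illustration) that submits work both ways to a global queue:

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

dispatch_async(queue, ^{
    // Submitted asynchronously: the calling thread does not wait for this block.
    NSLog(@"Async block: runs whenever GCD gets to it");
});
NSLog(@"Logged immediately after submitting the async block");

dispatch_sync(queue, ^{
    // Submitted synchronously: the calling thread is blocked until this block finishes.
    NSLog(@"Sync block: the caller is waiting on this");
});
NSLog(@"Logged only after the sync block has completed");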
Critical Section
This is a piece of code that must not be executed concurrently, that is, from two threads at once. This is usually because the code manipulates a shared resource such as a variable that can become corrupt if it’s accessed by concurrent processes.
Race Condition
This is a situation where the behavior of a software system depends on a specific sequence or timing of events that execute in an uncontrolled manner, such as the exact order of execution of the program’s concurrent tasks. Race conditions can produce unpredictable behavior that isn’t immediately evident through code inspection.
Deadlock
Two (or sometimes more) items — in most cases, threads — are said to be deadlocked if they all get stuck waiting for each other to complete or perform another action. The first can’t finish because it’s waiting for the second to finish. But the second can’t finish because it’s waiting for the first to finish.
Thread Safe
Thread safe code can be safely called from multiple threads or concurrent tasks without causing any problems (data corruption, crashing, etc). Code that is not thread safe must only be run in one context at a time. An example of thread safe code is NSDictionary: you can use it from multiple threads at the same time without issue. On the other hand, NSMutableDictionary is not thread safe and should only be accessed from one thread at a time.
Context Switch
A context switch is the process of storing and restoring execution state when you switch between executing different threads on a single process. This process is quite common when writing multitasking apps, but comes at a cost of some additional overhead.
Concurrency vs Parallelism
Concurrency and parallelism are often mentioned together, so it’s worth a short explanation to distinguish them from each other.
Separate parts of concurrent code can be executed “simultaneously”. However, it’s up to the system to decide how this happens — or if it happens at all.
Multi-core devices execute multiple threads at the same time via parallelism; however, in order for single-core devices to achieve this, they must run a thread, perform a context switch, then run another thread or process. This usually happens quickly enough to give the illusion of parallel execution, as shown by the diagram below:
Although you may write your code to use concurrent execution under GCD, it’s up to GCD to decide how much parallelism is required. Parallelism requires concurrency, but concurrency does not guarantee parallelism.
The deeper point here is that concurrency is actually about structure. When you code with GCD in mind, you structure your code to expose the pieces of work that can run simultaneously, as well as the ones that must not be run simultaneously. If you want to delve more deeply into this subject, check out this excellent talk by Rob Pike.
Queues
GCD provides dispatch queues to handle blocks of code; these queues manage the tasks you provide to GCD and execute those tasks in FIFO order. This guarantees that the first task added to the queue is the first task started in the queue, the second task added will be the second to start, and so on down the line.
All dispatch queues are themselves thread-safe in that you can access them from multiple threads simultaneously. The benefits of GCD are apparent when you understand how dispatch queues provide thread-safety to parts of your own code. The key to this is to choose the right kind of dispatch queue and the right dispatching function to submit your work to the queue.
In this section you’ll take a look at the two kinds of dispatch queues, the particular queues GCD offers, and then work through some examples that illustrate how to add work to the queues with the GCD dispatching functions.
Serial Queues
Tasks in serial queues execute one at a time, each task starting only after the preceding task has finished. As well, you won’t know the amount of time between one block ending and the next one beginning, as shown in the diagram below:
The execution timing of these tasks is under the control of GCD; the only thing you’re guaranteed to know is that GCD executes only one task at a time and that it executes the tasks in the order they were added to the queue.
Since no two tasks in a serial queue can ever run concurrently, there is no risk they might access the same critical section concurrently; that protects the critical section from race conditions with respect to those tasks only. So if the only way to access that critical section is via a task submitted to that dispatch queue, then you can be sure that the critical section is safe.
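As a minimal sketch of that idea (the queue label and the shared counter are made up for illustration), funneling every touch of a shared variable through one custom serial queue keeps the critical section safe:

// All access to 'counter' goes through this one serial queue,
// so no two blocks that touch it can ever run at the same time.
dispatch_queue_t counterQueue = dispatch_queue_create("com.example.counterQueue", DISPATCH_QUEUE_SERIAL);
__block NSUInteger counter = 0;

dispatch_async(counterQueue, ^{
    counter++; // the critical section: only ever executed by one task at a time
});

__block NSUInteger snapshot;
dispatch_sync(counterQueue, ^{
    snapshot = counter; // reads are serialized behind any pending writes
});
NSLog(@"Counter is %lu", (unsigned long)snapshot);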
Concurrent Queues
Tasks in concurrent queues are guaranteed to start in the order they were added…and that’s about all you’re guaranteed! Items can finish in any order and you have no knowledge of the time it will take for the next block to start, nor the number of blocks that are running at any given time. Again, this is entirely up to GCD.
The diagram below shows a sample task execution plan of four concurrent tasks under GCD:
Notice how Block 1, 2, and 3 all ran quickly, one after another, while it took a while for Block 1 to start after Block 0 started. Also, Block 3 started after Block 2 but finished first.
The decision of when to start a block is entirely up to GCD. If the execution time of one block overlaps with another, it’s up to GCD to determine if it should run on a different core, if one is available, or instead to perform a context switch to a different block of code.
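Here’s a small sketch of that behavior (the sleep is just a stand-in for real work): four blocks dispatched to a global concurrent queue start in FIFO order, but may overlap and finish in any order.

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

for (NSUInteger i = 0; i < 4; i++) {
    dispatch_async(queue, ^{
        // Blocks are dequeued in FIFO order (0, 1, 2, 3)...
        NSLog(@"Block %lu started", (unsigned long)i);
        [NSThread sleepForTimeInterval:arc4random_uniform(3)]; // stand-in for real work
        // ...but they can run simultaneously and finish in any order.
        NSLog(@"Block %lu finished", (unsigned long)i);
    });
}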
Just to make things interesting, GCD provides you with at least five particular queues to choose from within each queue type.
Queue Types
First, the system provides you with a special serial queue known as the main queue. Like any serial queue, tasks in this queue execute one at a time. However, it’s guaranteed that all tasks will execute on the main thread, which is the only thread allowed to update your UI. This queue is the one to use for sending messages to UIViews or posting notifications.
The system also provides you with several concurrent queues. These are known as the Global Dispatch Queues. There are currently four global queues of different priority: background, low, default, and high. Be aware that Apple’s APIs also use these queues, so any tasks you add won’t be the only ones on these queues.
Finally, you can also create your own custom serial or concurrent queues. That means you have at least five queues at your disposal: the main queue, four global dispatch queues, plus any custom queues that you add to the mix!
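For reference, here’s how you get a handle on each of these queues; the labels on the custom queues are made up for illustration:

// The main queue: a serial queue whose tasks run on the main thread.
dispatch_queue_t mainQueue = dispatch_get_main_queue();

// One of the four global concurrent queues, chosen by priority; the others are
// DISPATCH_QUEUE_PRIORITY_HIGH, _LOW, and _BACKGROUND.
dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

// A custom serial queue.
dispatch_queue_t serialQueue = dispatch_queue_create("com.example.serialQueue", DISPATCH_QUEUE_SERIAL);

// A custom concurrent queue.
dispatch_queue_t concurrentQueue = dispatch_queue_create("com.example.concurrentQueue", DISPATCH_QUEUE_CONCURRENT);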
And that’s the big picture of dispatch queues!
The “art” of GCD comes down to choosing the right queue dispatching function to submit your work to the queue. The best way to experience this is to work through the examples below, where we’ve provided some general recommendations along the way.
Getting Started
Since the goal of this tutorial is to optimize your code and safely call it from different threads using GCD, you’ll start with an almost-finished project named GooglyPuff.
GooglyPuff is a non-optimized, threading-unsafe app that overlays googly eyes on detected faces using Core Image’s face detection API. For the base image, you can select any image from the Photo Library or from a set of predefined images downloaded from URLs on the internet.
Once you’ve downloaded the project, extract it to a convenient location, then open it up in Xcode and build and run. The app will look like the following:
Notice when you choose the Le Internet option to download pictures, a UIAlertView pops up prematurely. You’ll fix this in the second part of this series.
There are four classes of interest in this project:
- PhotoCollectionViewController: This is the first view controller that starts the app. It showcases all the selected photos through their thumbnails.
- PhotoDetailViewController: This performs the logic to add googly eyes to the image and to display the resulting image in a UIScrollView.
- Photo: This is a class cluster which instantiates photos from an instance of NSURL or from an instance of ALAsset. This class provides an image, thumbnail, and a status when downloading from a URL.
- PhotoManager: This manages all the instances of Photo.
Handling Background Tasks with dispatch_async
Head back to the app and add some photos from your Photo Library or use the Le Internet option to download a few.
Notice how long it takes for a new PhotoDetailViewController to instantiate after clicking on a UICollectionViewCell in the PhotoCollectionViewController; there’s a noticeable lag, especially when viewing large images on slower devices.
It’s often easy to overload UIViewController’s viewDidLoad with too much clutter; this often results in longer waits before the view controller appears. If possible, it’s best to offload some work to be done in the background if it’s not absolutely essential at load time.
This sounds like a job for dispatch_async!
Open PhotoDetailViewController and replace viewDidLoad with the following implementation:
- (void)viewDidLoad {
  [super viewDidLoad];
  NSAssert(_image, @"Image not set; required to use view controller");
  self.photoImageView.image = _image;

  // Resize if necessary to ensure it's not pixelated
  if (_image.size.height <= self.photoImageView.bounds.size.height &&
      _image.size.width <= self.photoImageView.bounds.size.width) {
    [self.photoImageView setContentMode:UIViewContentModeCenter];
  }

  dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{ // 1
    UIImage *overlayImage = [self faceOverlayImageFromImage:_image];

    dispatch_async(dispatch_get_main_queue(), ^{ // 2
      [self fadeInNewImage:overlayImage]; // 3
    });
  });
}
Here’s what’s going on in the modified code above:
- You first move the work off of the main thread and onto a global queue. Because this is a dispatch_async() call, the block is submitted asynchronously, meaning that execution of the calling thread continues. This lets viewDidLoad finish earlier on the main thread and makes the loading feel more snappy. Meanwhile, the face detection processing is started and will finish at some later time.
- At this point, the face detection processing is complete and you’ve generated a new image. Since you want to use this new image to update your UIImageView, you add a new block of work to the main queue. Remember: you must always access UIKit classes on the main thread!
- Finally, you update the UI with fadeInNewImage:, which performs a fade-in transition of the new googly eyes image.
Build and run your app; select an image and you’ll notice that the view controller loads up noticeably faster and adds the googly eyes after a short delay. This lends a nice effect to the app as you show the before and after photo for maximum impact.
As well, if you tried to load an insanely huge image, the app wouldn’t hang in the process of loading the view controller, which allows the app to scale well.
As mentioned above, dispatch_async appends a block onto a queue and returns immediately. The task will then be executed at some later time as decided by GCD. Use dispatch_async when you need to perform a network-based or CPU-intensive task in the background without blocking the current thread.
Here’s a quick guide of how and when to use the various queue types with dispatch_async:
- Custom Serial Queue: A good choice when you want to perform background work serially and track it. This eliminates resource contention since you know only one task at a time is executing. Note that if you need the data from a method, you must inline another block to retrieve it (see the sketch after this list) or consider using dispatch_sync.
- Main Queue (Serial): This is a common choice to update the UI after completing work in a task on a concurrent queue. To do this, you’ll code one block inside another. As well, if you’re in the main queue and call dispatch_async targeting the main queue, you can guarantee that this new task will execute sometime after the current method finishes.
- Concurrent Queue: This is a common choice to perform non-UI work in the background.
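To illustrate the first bullet’s point about inlining another block to retrieve data, here’s a minimal sketch; the workerQueue property, expensiveGreeting method, and completion-block shape are all hypothetical:

// Hypothetical helper: do work on a private custom serial queue, then hand the
// result back to the caller on the main queue via a completion block.
- (void)loadGreetingWithCompletion:(void (^)(NSString *greeting))completion {
    dispatch_async(self.workerQueue, ^{                // self.workerQueue: an assumed custom serial queue
        NSString *result = [self expensiveGreeting];   // placeholder for real background work
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) {
                completion(result);                    // deliver the data on the main thread
            }
        });
    });
}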
Delaying Work with dispatch_after
Consider the UX of your app for a moment. It’s possible that users might be confused about what to do when they open the app for the first time — were you? :]
It would be a good idea to display a prompt to the user if there aren’t any photos in the PhotoManager class. However, you also need to think about how the user’s eyes will navigate the home screen: if you display a prompt too quickly, they might miss it as their eyes linger on other parts of the view.
A one-second delay before displaying the prompt should be enough to catch the user’s attention as they get their first look at the app.
Add the following code to the stubbed-out implementation of showOrHideNavPrompt in PhotoCollectionViewController.m:
- (void)showOrHideNavPrompt {
  NSUInteger count = [[PhotoManager sharedManager] photos].count;
  double delayInSeconds = 1.0;
  dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC)); // 1
  dispatch_after(popTime, dispatch_get_main_queue(), ^(void){ // 2
    if (!count) {
      [self.navigationItem setPrompt:@"Add photos with faces to Googlyify them!"];
    } else {
      [self.navigationItem setPrompt:nil];
    }
  });
}
showOrHideNavPrompt executes in viewDidLoad and anytime your UICollectionView is reloaded. Taking each numbered comment in turn:
- You declare the variable that specifies the amount of time to delay.
- You then wait for the amount of time given in the delayInSeconds variable and then asynchronously add the block to the main queue.
Build and run the app. There should be a slight delay, which will hopefully grab the user’s attention and show them what to do.
dispatch_after works just like a delayed dispatch_async. You still have no control over the actual time of execution, nor can you cancel this once dispatch_after returns.
Wondering when it’s appropriate to use dispatch_after?
- Custom Serial Queue: Use caution when using dispatch_after on a custom serial queue. You’re better off sticking to the main queue.
- Main Queue (Serial): This is a good choice for dispatch_after; Xcode has a nice autocomplete template for this.
- Concurrent Queue: Use caution when using dispatch_after on custom concurrent queues; it’s rare that you’ll do this. Stick to the main queue for these operations.
Making Your Singletons Thread-Safe
Singletons. Love them or hate them, they’re as popular in iOS as cats are on the web. :]
One frequent concern with singletons is that often they’re not thread safe. This concern is well-justified given how they’re used: singletons are often accessed from multiple controllers at the same time. Threading concerns for singletons range from instantiation to reads and writes of information.
The PhotoManager class has been implemented as a singleton — and it suffers from these issues in its current state. To see how things can go wrong really quickly, you’ll create a controlled race condition on the singleton instance.
Navigate to PhotoManager.m and find sharedManager; it will look like the code below:
+ (instancetype)sharedManager {
  static PhotoManager *sharedPhotoManager = nil;
  if (!sharedPhotoManager) {
    sharedPhotoManager = [[PhotoManager alloc] init];
    sharedPhotoManager->_photosArray = [NSMutableArray array];
  }
  return sharedPhotoManager;
}
The code is rather simple in its current state; you create a singleton and instantiate a private NSMutableArray property named photosArray.
However, the if condition branch is not thread safe; if you invoke this method multiple times, there’s a possibility that one thread (call it Thread-A) could enter the if block and a context switch could occur before sharedPhotoManager is allocated. Then another thread (Thread-B) could enter the if, allocate an instance of the singleton, then exit.
When the system context switches back to Thread-A, you’ll then allocate another instance of the singleton, then exit. At that point you have two instances of a singleton — which is not what you want!
To force this condition to happen, replace sharedManager in PhotoManager.m with the following implementation:
+ (instancetype)sharedManager {
  static PhotoManager *sharedPhotoManager = nil;
  if (!sharedPhotoManager) {
    [NSThread sleepForTimeInterval:2];
    sharedPhotoManager = [[PhotoManager alloc] init];
    NSLog(@"Singleton has memory address at: %@", sharedPhotoManager);
    [NSThread sleepForTimeInterval:2];
    sharedPhotoManager->_photosArray = [NSMutableArray array];
  }
  return sharedPhotoManager;
}
In the code above you’re forcing a context switch to happen with NSThread’s sleepForTimeInterval: class method.
Open AppDelegate.m and add the following code to the very beginning of application:didFinishLaunchingWithOptions:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
  [PhotoManager sharedManager];
});

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
  [PhotoManager sharedManager];
});
This creates multiple asynchronous concurrent calls to instantiate the singleton and invoke the race condition as described above.
Build and run your project; check the console output and you’ll see multiple singletons instantiated, as shown below:
Notice that there are several lines all showing different addresses of the singleton instance. That defeats the purpose of a singleton, doesn’t it? :]
This output shows you that the critical section executed several times when it should only have executed once. Now, admittedly, you forced this situation to happen, but you can imagine how this condition could occur unintentionally as well.
Note: Depending on how the threads are scheduled, the duplicate NSLogs will only show up on occasion. Threading issues can be extremely hard to debug since they tend to be hard to reproduce.
To correct this condition, the instantiation code should only execute once and block other instances from running while it is in the critical section of the if condition. This is exactly what dispatch_once does.
Replace the conditional if check with dispatch_once in the singleton initialization method as shown below:
+ (instancetype)sharedManager {
  static PhotoManager *sharedPhotoManager = nil;
  static dispatch_once_t onceToken;
  dispatch_once(&onceToken, ^{
    [NSThread sleepForTimeInterval:2];
    sharedPhotoManager = [[PhotoManager alloc] init];
    NSLog(@"Singleton has memory address at: %@", sharedPhotoManager);
    [NSThread sleepForTimeInterval:2];
    sharedPhotoManager->_photosArray = [NSMutableArray array];
  });
  return sharedPhotoManager;
}
Build and run your app; check the console output and you’ll now see one and only one instantiation of the singleton — which is what you’d expect for a singleton! :]
Now that you understand the importance of preventing race conditions, remove the dispatch_async statements from AppDelegate.m and replace PhotoManager‘s singleton initialization with the following implementation:
+ (instancetype)sharedManager {
  static PhotoManager *sharedPhotoManager = nil;
  static dispatch_once_t onceToken;
  dispatch_once(&onceToken, ^{
    sharedPhotoManager = [[PhotoManager alloc] init];
    sharedPhotoManager->_photosArray = [NSMutableArray array];
  });
  return sharedPhotoManager;
}
dispatch_once() executes a block once and only once in a thread safe manner. Different threads that try to access the critical section — the code passed to dispatch_once — while a thread is already in this section are blocked until the critical section completes.
It should be noted that this just makes access to the shared instance thread safe. It does not make the class thread safe, necessarily. You could still have other critical sections in the class, for instance anything that manipulates internal data. Those would need to be made thread safe in other ways, such as synchronizing access to the data, as you’ll see in the following sections.
Handling the Readers and Writers Problem
Thread-safe instantiation is not the only issue when dealing with singletons. If the singleton property represents a mutable object, then you need to consider whether that object is itself thread-safe.
If the object in question is a Foundation container class, then the answer is — “probably not”! Apple maintains a helpful and somewhat chilling list of the numerous Foundation classes which are not thread-safe. NSMutableArray, used by your singleton, is right there among the rest.
Although many threads can read an instance of NSMutableArray simultaneously without issue, it’s not safe to let one thread modify the array while another is reading it. Your singleton doesn’t prevent this condition from happening in its current state.
To see the problem, have a look at addPhoto: in PhotoManager.m, which has been reproduced below:
- (void)addPhoto:(Photo *)photo {
  if (photo) {
    [_photosArray addObject:photo];
    dispatch_async(dispatch_get_main_queue(), ^{
      [self postContentAddedNotification];
    });
  }
}
This is a write method as it modifies a private mutable array object.
Now take a look at the photos read accessor.
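A minimal sketch of what the unmodified accessor presumably looks like, assuming it simply returns an immutable copy of the private array:

// Assumed original read accessor: return an immutable snapshot of the private mutable array.
- (NSArray *)photos {
  return [NSArray arrayWithArray:_photosArray];
}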
This is termed a read method as it’s reading the mutable array. It makes an immutable copy for the caller in order to defend against the caller mutating the array inappropriately, but none of this provides any protection against one thread calling the write method addPhoto: while simultaneously another thread calls the read method photos.
This is the classic software development Readers-Writers Problem. GCD provides an elegant solution: you can create a readers-writer lock using dispatch barriers.
Dispatch barriers are a group of functions acting as a serial-style bottleneck when working with concurrent queues. Using GCD’s barrier API ensures that the submitted block is the only item executed on the specified queue for that particular time. This means that all items submitted to the queue prior to the dispatch barrier must complete before the block will execute.
When the block’s turn arrives, the barrier executes the block and ensures that the queue does not execute any other blocks during that time. Once finished, the queue returns to its default implementation. GCD provides both synchronous and asynchronous barrier functions.
The diagram below illustrates the effect of barrier functions on various asynchronous blocks:
Notice how in normal operation the queue acts just like a normal concurrent queue. But when the barrier is executing, it essentially acts like a serial queue. That is, the barrier is the only thing executing. After the barrier finishes, the queue goes back to being a normal concurrent queue.
Here’s when you would — and wouldn’t — use barrier functions:
- Custom Serial Queue: A bad choice here; barriers won’t do anything helpful since a serial queue executes one operation at a time anyway.
- Global Concurrent Queue: Use caution here; this probably isn’t the best idea since other systems might be using the queues and you don’t want to monopolize them for your own purposes.
- Custom Concurrent Queue: This is a great choice for atomic or critical areas of code. Anything you’re setting or instantiating that needs to be thread safe is a great candidate for a barrier.
Since the only decent choice above is the custom concurrent queue, you’ll need to create one of your own to handle your barrier function and separate the read and write functions. The concurrent queue will allow multiple read operations simultaneously.
Open PhotoManager.m, and add the following private property to the class extension category:
@interface PhotoManager ()
@property (nonatomic, strong, readonly) NSMutableArray *photosArray;
@property (nonatomic, strong) dispatch_queue_t concurrentPhotoQueue; ///< Add this
@end
Find addPhoto: and replace it with the following implementation:
- (void)addPhoto:(Photo *)photo {
  if (photo) { // 1
    dispatch_barrier_async(self.concurrentPhotoQueue, ^{ // 2
      [_photosArray addObject:photo]; // 3
      dispatch_async(dispatch_get_main_queue(), ^{ // 4
        [self postContentAddedNotification];
      });
    });
  }
}
Here’s how your new write function works:
- Check that there’s a valid photo before performing all the following work.
- Add the write operation using your custom queue. When the critical section executes at a later time this will be the only item in your queue to execute.
- This is the actual code which adds the object to the array. Since it’s a barrier block, this block will never run simultaneously with any other block in concurrentPhotoQueue.
- Finally, you post a notification that you’ve added the image. This notification should be posted from the main thread because it will do UI work, so here you dispatch another task asynchronously to the main queue for the notification.
This takes care of the write, but you also need to implement the photos read method and instantiate concurrentPhotoQueue.
To ensure thread safety with the writer side of matters, you need to perform the read on the concurrentPhotoQueue queue. You need to return from the function though, so you can’t dispatch asynchronously to the queue because that wouldn’t necessarily run before the reader function returns.
In this case, dispatch_sync would be an excellent candidate.
dispatch_sync() synchronously submits work and waits for it to be completed before returning. Use dispatch_sync to keep track of your work with dispatch barriers, or when you need to wait for the operation to finish before you can use the data processed by the block. If you’re working with the second case, you’ll sometimes see a __block variable written outside of the dispatch_sync scope in order to use the processed object after the dispatch_sync call returns.
You need to be careful though. Imagine if you call dispatch_sync and target the current queue you’re already running on. This will result in a deadlock because the call waits until the block finishes, but the block can’t finish (it can’t even start!) until the currently executing task finishes, which it never will, because it’s waiting on the block. This should force you to be conscious of which queue you’re calling from, as well as which queue you’re passing in.
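Here’s a minimal sketch of that deadlock pattern (the queue label is made up for illustration); never do this:

dispatch_queue_t serialQueue = dispatch_queue_create("com.example.deadlockQueue", DISPATCH_QUEUE_SERIAL);

dispatch_async(serialQueue, ^{
    // This block is currently running on serialQueue...
    dispatch_sync(serialQueue, ^{
        // ...so this inner block can't start until the outer block finishes,
        // while the outer block is stuck waiting for the inner one: deadlock.
        NSLog(@"This line never executes");
    });
    NSLog(@"Neither does this one");
});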
Here’s a quick overview of when and where to use dispatch_sync:
- Custom Serial Queue: Be VERY careful in this situation; if you’re running in a queue and call dispatch_sync targeting the same queue, you will definitely create a deadlock.
- Main Queue (Serial): Be VERY careful for the same reasons as above; this situation also has potential for a deadlock condition.
- Concurrent Queue: This is a good candidate to sync work through dispatch barriers or when waiting for a task to complete so you can perform further processing.
Still working in PhotoManager.m, replace photos with the following implementation:
- (NSArray *)photos {
  __block NSArray *array; // 1
  dispatch_sync(self.concurrentPhotoQueue, ^{ // 2
    array = [NSArray arrayWithArray:_photosArray]; // 3
  });
  return array;
}
Here’s your read function. Taking each numbered comment in turn, you’ll find the following:
- The __block storage qualifier allows array to be written to inside the block. Without it, array would be read-only inside the block and your code wouldn’t even compile.
- Dispatch synchronously onto the concurrentPhotoQueue to perform the read.
- Store the photo array in array and return it.
Finally, you need to instantiate your concurrentPhotoQueue property. Change sharedManager to instantiate the queue like so:
+ (instancetype)sharedManager {
  static PhotoManager *sharedPhotoManager = nil;
  static dispatch_once_t onceToken;
  dispatch_once(&onceToken, ^{
    sharedPhotoManager = [[PhotoManager alloc] init];
    sharedPhotoManager->_photosArray = [NSMutableArray array];

    // ADD THIS:
    sharedPhotoManager->_concurrentPhotoQueue = dispatch_queue_create("com.selander.GooglyPuff.photoQueue",
                                                                      DISPATCH_QUEUE_CONCURRENT);
  });
  return sharedPhotoManager;
}
This initializes concurrentPhotoQueue as a concurrent queue using dispatch_queue_create. The first parameter is a label in reverse-DNS style; make sure it’s descriptive, since this can be helpful when debugging. The second parameter specifies whether you want your queue to be serial or concurrent.
Note: You’ll sometimes see code that passes 0 or NULL as the second parameter of dispatch_queue_create. This is a dated way of creating a serial dispatch queue; it’s always better to be specific with your parameters.
Congratulations — your PhotoManager singleton is now thread safe. No matter where or how you read or write photos, you can be confident that it will be done in a safe manner with no amusing surprises.
A Visual Review of Queueing
Still not 100% sure on the essentials of GCD? Make sure you’re comfortable with the basics by creating simple examples yourself using GCD functions, using breakpoints and NSLog statements to make sure you understand what is happening.
I’ve provided two animated GIFs below to help cement your understanding of dispatch_async and dispatch_sync. The code is included above each GIF as a visual aid; pay attention to each step of the GIF showing the breakpoint in the code on the left and the related queue state on the right.
dispatch_sync Revisited
- (void)viewDidLoad {
  [super viewDidLoad];

  dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    NSLog(@"First Log");
  });

  NSLog(@"Second Log");
}
Here’s your guide to the various states of the diagram:
- The main queue chugs along executing tasks in order — up next is a task to instantiate UIViewController, which includes viewDidLoad.
- viewDidLoad executes on the main thread.
- The main thread is currently inside viewDidLoad and is just about to reach dispatch_sync.
- The dispatch_sync block is added to a global queue and will execute at a later time. Processing is halted on the main thread until the block completes. Meanwhile, the global queue is concurrently processing tasks; recall that blocks will be dequeued in FIFO order on a global queue but can be executed concurrently.
- The global queue processes the tasks that were already present on the queue before the dispatch_sync block was added.
- Finally, the dispatch_sync block has its turn.
- The block is done, so the tasks on the main thread can resume.
- The viewDidLoad method is done, and the main queue carries on processing other tasks.
dispatch_sync adds a task to a queue and waits until that task completes. dispatch_async does the exact same thing, except that it doesn’t wait for the task to complete before proceeding onwards from the calling thread.
dispatch_async Revisited
- (void)viewDidLoad {
  [super viewDidLoad];

  dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    NSLog(@"First Log");
  });

  NSLog(@"Second Log");
}
- The main queue chugs along executing tasks in order — up next is a task to instantiate UIViewController, which includes viewDidLoad.
- viewDidLoad executes on the main thread.
- The main thread is currently inside viewDidLoad and is just about to reach dispatch_async.
- The dispatch_async block is added to a global queue and will execute at a later time.
- viewDidLoad continues to move on after adding dispatch_async to the global queue and the main thread turns its attention to the remaining tasks. Meanwhile, the global queue is concurrently processing its outstanding tasks. Remember that blocks will be dequeued in FIFO order on a global queue but can be executed concurrently.
- The block of code added by dispatch_async is now executing.
- The dispatch_async block is done and both NSLog statements have placed their output on the console.
In this particular instance, the second NSLog statement executes, followed by the first NSLog statement. This isn’t always the case — it’s dependent on what the hardware is doing at that given time, and you have no control nor knowledge as to which statement will execute first. The “first” NSLog could be the first log to execute in some invocations.
Where to Go From Here?
In this tutorial, you learned how to make your code thread safe and how to maintain the responsiveness of the main thread while performing CPU intensive tasks.
You can download the GooglyPuff Project which contains all the improvements made in this tutorial so far. In the second part of this tutorial you’ll continue to improve upon this project.
If you plan on optimizing your own apps, you really should be profiling your work with the Time Profiler template in Instruments. Using this utility is outside the scope of this tutorial, so check out How to Use Instruments for an excellent overview.
Also make sure that you profile with an actual device, since testing on the Simulator can give a very inaccurate picture of the program’s speed.
In the next tutorial you’ll dive even deeper into GCD’s API to do even more cool stuff.
If you have any questions or comments, feel free to join the discussion below!