Among the many visual changes of iOS 7, one of the more appealing is the subtle use of blurs throughout the OS. Many third-party apps have already adopted this design detail, and are using it in all sorts of wonderful and creative ways.
This tutorial will walk you through several different techniques for implementing iOS 7 blur effects, all with the assistance of a framework called GPUImage.
Created by Brad Larson, GPUImage is a framework which, by taking advantage of the GPU, makes it incredibly easy to apply different effects and filters to both images and videos whilst maintaining great performance, often beating the built-in methods provided by Apple’s APIs.
Getting Started
Download the starter project here and extract it to a convenient location on your drive.
Open Video Blurring.xcodeproj in Xcode and run it on your device. It will look similar to the following:
Tap the Menu in the upper-left of the screen (the three horizontal stripes). You’re presented with two options: record a new video or play back an existing video.
Notice how all of the user interface elements have a gray backdrop; that’s rather dull. You’ll be replacing those dull gray backgrounds with some nice iOS 7 blur effects instead.
Why Use Blurs?
Beyond looking cool, blurs communicate three important concepts to the users of your apps: depth, context and focus.
Depth
Depth provides cues to the user that the interface is layered, and helps them understand how to navigate your app. Previous iOS versions communicated depth with three-dimensional bevels and glossy buttons reflecting an emulated light source, but iOS 7 communicates depth using blurs and parallax.
The parallax effect is evident when you tilt your iOS 7 device from side-to-side. You’ll notice the icons appear to move independently from the background. This provides cues to the user that the interface is composed of different layers, and that important elements sit on top of other less important interface elements — which leads into the next concept: context.
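If you’d like to add the same parallax behavior to views in your own apps, iOS 7 exposes it through motion effects. Here’s a minimal sketch, assuming a hypothetical view named iconView and arbitrary ten-point offsets:

// A minimal sketch: give a hypothetical iconView a subtle parallax effect.
UIInterpolatingMotionEffect *xTilt = [[UIInterpolatingMotionEffect alloc]
    initWithKeyPath:@"center.x"
               type:UIInterpolatingMotionEffectTypeTiltAlongHorizontalAxis];
xTilt.minimumRelativeValue = @(-10); // assumed offsets; tune to taste
xTilt.maximumRelativeValue = @(10);
[iconView addMotionEffect:xTilt];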
Context
Context allows a user to get a sense of bearing within your app. Animated transitions provide excellent context; instead of having a new view instantly appear when you tap a button, animating between the views gives the user a moment to understand where the new view originates from, and how they may get back to the previous one.
Blurs allow you to show the previous view in the background, albeit out of focus, to give the user even more context as to where they were a moment ago. The Notification Center is a great example of this; when you pull it down, you can still see the original view in the background whilst you work on another task in the foreground.
Focus
Focusing on selective items removes clutter and lets the user navigate quickly through the interface. Users will instinctively ignore elements that are blurred, focusing instead on the more important, in-focus elements in the view.
You will implement two different types of blurs in this tutorial: Static Blur and Dynamic Blur. Static blurs represent a snapshot in time and do not reflect changes in the content below them. In most cases, a static blur works perfectly fine. Dynamic blurs, in contrast, update as the content behind them changes.
It’s much more exciting to see things in action than to talk about them, so head right into the next section to get started on adding some iOS 7 blur effects!
Adding Static Blur
The first step in creating a static blur is converting the current on-screen view into an image. Once that’s done, you simply blur that image to create a static blur. Apple provides some wonderful APIs to convert any view into an image — and there are some new ones in iOS 7 to do it even faster.
These new APIs are all part of Apple’s new snapshot APIs. The snapshotting APIs give you the ability to capture not just a single view, but also the entire view hierarchy. That means if you instruct it to capture a view, it will also capture all the buttons, labels, switches and various views that are placed on top of it.
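For reference, if you only need a visual copy of a view rather than a UIImage you can filter, the snapshot APIs can also hand you a lightweight snapshot view directly. A quick sketch of that variant:

// A quick sketch: capture the view hierarchy as a lightweight view (no UIImage involved).
UIView *snapshot = [self.view snapshotViewAfterScreenUpdates:YES];
[self.view addSubview:snapshot]; // an exact visual copy, cheap to create

In this tutorial, though, you need a UIImage that you can hand to GPUImage, so you’ll use the drawing-based approach instead.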
You’ll implement this capture logic as a category for UIView. That way, you can quickly and easily convert any view and its contained view hierarchy into an image — and get some code reuse to boot!
Creating your Screenshot Category
Go to File/New/File… and select the iOS/Cocoa Touch/Objective-C category, like so:
Name the category Screenshot and make it a category on UIView, as shown below:
Add the following method declaration to UIView+Screenshot.h:
-(UIImage *)convertViewToImage;
Next, add the following method to UIView+Screenshot.m:
-(UIImage *)convertViewToImage {
    UIGraphicsBeginImageContext(self.bounds.size);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
The method above starts with a call to UIGraphicsBeginImageContext() and ends with UIGraphicsEndImageContext(). These two lines are bookends for what is known as the image context. A context can be one of several things; it can be the screen, or as in this case, an image. The net effect of these two lines is an off-screen canvas on which to draw the view hierarchy.

drawViewHierarchyInRect:afterScreenUpdates: takes the view hierarchy and draws it onto the current context.

Finally, UIGraphicsGetImageFromCurrentImageContext() retrieves the generated UIImage from the image context; that image is the object returned by this method.
Now that you have a category to hold this logic, you’ll need to import it in order to use it.
Add the following import to the top of DropDownMenuController.m, just below the other import statements:
#import "UIView+Screenshot.h" |
Add the following method to the end of the same file:
-(void)updateBlur {
    UIImage *image = [self.view.superview convertViewToImage];
}
Here you ensure you capture not just the view but its superview as well. Otherwise, you’d just capture the menu alone.
Testing Your Capture with Breakpoints
To test out your code, add a breakpoint on the line directly below the call to convertViewToImage. The program will halt execution when it hits the breakpoint, and you can view the captured image to make sure your code is functioning properly.
To add a breakpoint, click in the margin to the left of the line and Xcode will mark the spot with a blue arrow, as shown below:
The only thing left to do before you test things out is to call your new method.
Scroll up to show and add a call to updateBlur, directly below addToParentViewController:
-(void)show {
    [self addToParentViewController];
    [self updateBlur]; // Add this line

    CGRect deviceSize = [UIScreen mainScreen].bounds;
    [UIView animateWithDuration:0.25f animations:^(void){
        _blurView.frame = CGRectMake(0, 0, deviceSize.size.height, MENUSIZE);
        _backgroundView.frame = CGRectMake(0, 0, _backgroundView.frame.size.width, MENUSIZE);
    }];
}
Build and run the app; tap the menu button and you’ll see that Xcode stops at your breakpoint, as shown below:
To preview the image, select image in the lower-left pane of the debugger, then click the Quick Look icon as indicated below:
There’s your captured image, as expected.
Displaying the Captured Image
Now it’s a simple matter to display the captured image in the background of your menu.
You’d usually use an instance of UIImageView to display an image, but since you’ll be using GPUImage to blur the images, you’ll need to use an instance of GPUImageView instead.
The GPUImage framework has already been added to the project; you just need to import the header.
Add the following import to the top of DropDownMenuController.m, just below the others:
#import <GPUImage/GPUImage.h> |
There’s currently a UIView instance variable named _blurView, which provides the menu’s gray background. Change the declaration so _blurView is an instance of GPUImageView instead, as shown below:
@implementation DropDownMenuController {
    GPUImageView *_blurView;
    UIView *_backgroundView;
}
You’ll notice that Xcode is giving you a warning: you’re initialising an instance of UIView, not GPUImageView, as it expects.
Fix that now. Update the variable assignment in viewDidLoad to the following:
_blurView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, deviceSize.size.height, 0)];
Add the following two lines directly below it, removing the line that sets the background color:
_blurView.clipsToBounds = YES;
_blurView.layer.contentsGravity = kCAGravityTop;
clipsToBounds ensures the contents of the view stay inside the view, while contentsGravity fixes the image to the top of the image view.
Since _blurView is already being used as the background, you don’t have to write any extra code to make it visible.
You also need to declare the filter that you will use for blurring.
Add the following declaration to the @implementation block of DropDownMenuController.m:
GPUImageiOSBlurFilter *_blurFilter;
Find the breakpoint you added earlier, right click it and select Delete Breakpoint, as follows:
Now would be a great time to initialize your blur filter. Add the following code to DropDownMenuController.m:
-(void)updateBlur {
    if(_blurFilter == nil){
        _blurFilter = [[GPUImageiOSBlurFilter alloc] init];
        _blurFilter.blurRadiusInPixels = 1.0f;
    }

    UIImage *image = [self.view.superview convertViewToImage];
}
Note that the blur radius is set to only a single pixel; you’re temporarily setting this to a low value so you can ensure the image is properly positioned. Once you’re happy with the positioning, you’ll increase it.
Now you need to display the image in the GPUImageView. However, you can’t simply add an instance of UIImage to a GPUImageView as you would with a UIImageView — you first need to create a GPUImagePicture.
Add the following line to the bottom of updateBlur:
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:image];
At this point, you now have your image, blur filter and image view.
Add the following code to the bottom of updateBlur:
[picture addTarget:_blurFilter];
[_blurFilter addTarget:_blurView];
[picture processImage];
These statements act as the glue which bonds everything together. You add the filter as a target of the picture, and then the image view as a target of the filter.
The really neat part of all this is that the image processing is taking place on the GPU; this means that the user interface won’t stall while the blur is calculated and displayed. The final result will simply show up in the image view when the processing is complete. Typically this doesn’t take much time at all, but it never hurts to let the GPU do the heavy lifting when appropriate.
Build and run your app; click on the menu and you’ll see something similar to the following:
That looks a little odd, doesn’t it? What you’re seeing now is the resized image shrunk to fit inside the menu view. To correct this, you’ll need to specify what part of the image you want to show inside the GPUImageView — namely, the top half of your captured and processed view.
Setting contentsRect
Modify show in DropDownMenuController.m as follows:
-(void)show {
    [self addToParentViewController];
    [self updateBlur];

    CGRect deviceSize = [UIScreen mainScreen].bounds;
    [UIView animateWithDuration:0.25f animations:^(void){
        _blurView.frame = CGRectMake(0.0f, 0.0f, deviceSize.size.height, MENUSIZE);
        _backgroundView.frame = CGRectMake(0.0f, 0.0f, _backgroundView.frame.size.width, MENUSIZE);
        _blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, MENUSIZE / 320.0f); // Add this line!
    }];
}
By specifying the contentsRect you define the rectangle, in the unit coordinate space, that indicates the portion of the layer’s contents that should be used.
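For example, to show only the top-left quarter of a layer’s contents, you could write the following (a standalone illustration; someLayer is just a placeholder):

// A standalone illustration: contentsRect uses unit coordinates,
// so x, y, width and height all run from 0.0 to 1.0.
someLayer.contentsRect = CGRectMake(0.0f, 0.0f, 0.5f, 0.5f); // top-left quarter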
Build and run your app, tap the menu button, and…
Uh, that still doesn’t look right! You’re using the correct part of the image, but it’s still scaling inappropriately! The piece you’re missing is the correct content scaling.
Add the following line to show, at the bottom of the animation block:
_blurView.layer.contentsScale = (MENUSIZE / 320.0f) * 2;
The contentsScale property defines the mapping between the logical coordinate space of the layer (measured in points) and the physical coordinate space (measured in pixels). Higher scale factors indicate that each point in the layer is represented by more than one pixel at render time.
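As a quick illustration, on a Retina display a contentsScale of 2.0 means each point maps to two pixels, so a layer backed by a double-resolution image renders at full sharpness (retinaImage here is an assumed 200x200-pixel image):

// An assumed example: a 100x100 point layer backed by a 200x200 pixel image.
CALayer *layer = [CALayer layer];
layer.frame = CGRectMake(0.0f, 0.0f, 100.0f, 100.0f);
layer.contents = (__bridge id)retinaImage.CGImage; // retinaImage is hypothetical
layer.contentsScale = 2.0f; // two pixels per point, so no blurry upscaling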
Build and run your app, hit the Menu button and check if the scaling works properly:
Yeah — that looks much better! Now close the app, re-open it, and…uh-oh, what’s happened?
Well, that’s a bit problematic, to say the least. If you reset the contentsScale back to 2.0 before you animate the view, it fixes the half-bar problem.
Add the following line to show in DropDownMenuController.m, just above the animation block:
_blurView.layer.contentsScale = 2.0f;
Build and run your app; tap Menu, close the menu, and then tap Menu again. What does your menu look like now?
The half-size black box is no longer a problem — but now you have a full-size black box to contend with!
Resetting Blur Filters
This issue comes about the second time you calculate the blur; the proper way to solve it is to remove all of the targets from the blur filter, which resets it. If you don’t, the filter outputs nothing at all.
Update updateBlur as shown below:
-(void)updateBlur {
    if(_blurFilter == nil){
        _blurFilter = [[GPUImageiOSBlurFilter alloc] init];
        _blurFilter.blurRadiusInPixels = 1.0f;
    }

    UIImage *image = [self.view.superview convertViewToImage];
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:image];

    [picture addTarget:_blurFilter];
    [_blurFilter addTarget:_blurView];
    [picture processImageWithCompletionHandler:^{
        [_blurFilter removeAllTargets];
    }];
}
Here you’ve replaced processImage with processImageWithCompletionHandler:. This new method has a completion block that runs once the image processing is complete. Once the image is completely processed, you can safely remove all of the targets.
Build and run your app; tap the Menu and see if the black box issue is gone for good:
Open and close the menu a few more times to make sure you’ve squashed that bug for good.
Look closely at the blur effect as the menu opens — something doesn’t look quite right. To get a closer look, slow down the animation to see what’s happening in slow motion.
Update the duration of the animation block in show to 10.0f.
Build and run your app, tap Menu, and watch the menu appear in slow motion:
Ah, now you can see what’s wrong. The blurred image is sliding in from the top. You really want it to appear as if the blur effect itself is sliding down the screen.
Aligning the Background Image
This is where you need to play some tricks with the static blur. When the menu comes down, you need to align the blur with the backdrop instead. So instead of shifting the image view down, you need to expand it, starting at size zero and expanding to full size. This ensures the image will stay in place as the menu opens.
You already have the menu open to the full size in show — you just need to set the height of contentsRect to zero when the image view is first created and when it is hidden.
Add the following code to viewDidLoad in DropDownMenuController.m, just below where you initialize _blurView and set its initial properties:
_blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, 0.0f);
Still in the same file, add the following line to the bottom of the animation block in hide:
_blurView.layer.contentsRect = CGRectMake(0.0f, 0.0f, 1.0f, 0.0f);
The contentsRect property can be animated as well; therefore the original and updated rects will be interpolated automatically during the animation.
Build and run your app. Fortunately you still have the animation running slowly, so it’s easy to check if you’ve fixed the issue:
That looks much more natural. You now have a slide-in menu with a blurred background.
Before moving on, re-adjust the little bits of code you changed for testing purposes.
Go to the show method and change the duration of the animation block to 0.25.
Next, in updateBlur, change the value of _blurFilter.blurRadiusInPixels to 4.0f.
Build and run your app; pop open the menu a few times to see what it looks like now:
Live Blurring
Live blurring is a technically difficult issue to solve. In order to do live blurring effectively you need to capture the screen, blur it and display it, all whilst maintaining 60 frames per second. Using GPUImage, blurring images and displaying them at 60 frames per second is no problem.
The real tricky bit? How to capture the live screen images, believe it or not.
Since you are working with capturing the main user interface, you must use the main thread of the CPU to capture the screen and convert it into an image.
A Brief Branch on Threading
When you run a program, you are executing a list of instructions. Each list of instructions runs inside its own thread, and you can run multiple lists of instructions concurrently in separate threads. An app starts on the main thread and new threads are created and executed in the background as necessary. If you haven’t had to manage multiple threads before, you’ve likely always written your apps to run exclusively on the main thread.
The main thread handles interactions and updates to the user interface; it’s critical to make sure that it stays responsive. If you overload the main thread with tasks, you will see the user interface start to stutter or freeze up completely.
If you’ve ever scrolled through the Twitter or Facebook app on your phone, then you’ve seen background threads in action. Not all profile pictures show up immediately as you scroll; the app launches background threads to fetch the image, and once the image has been retrieved it then shows up on-screen.
Without background threads, the scrolling table view would freeze while it tried to retrieve each and every profile image; since retrieving an image can take several seconds, it’s best passed off to a background thread to keep the user interface smooth and responsive.
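In code, that pattern typically looks something like the following Grand Central Dispatch sketch, where profileImageURL and cell are hypothetical stand-ins for whatever your table view actually uses:

// A sketch of the classic pattern: slow work in the background, UI updates on the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *data = [NSData dataWithContentsOfURL:profileImageURL]; // slow network fetch
    UIImage *profileImage = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        cell.imageView.image = profileImage; // all UI work happens on the main thread
        // (a real app would also guard against cell reuse here)
    });
});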
So how does this affect your app? The UIView snapshot APIs covered earlier must be run on the main thread. This means that each time they run, the entire interface will freeze for a moment before it continues on.
For static blurs, this action happens so fast that you don’t notice it. You only need to capture the screen once. However live blur effects need to capture the screen 60 times a second. If you performed live captures like this on the main thread, animations and transitions would become choppy and stuttered.
Even worse, as the complexity of your user interface increased, the time it would take to capture the interface would also increase, and your app would stutter even more!
What to do, dear reader?
Potential Live Blur Solutions
One solution which many open source live blur libraries use is to slow down the capture frame rate. So instead of capturing the interface 60 times a second, you capture it maybe 20, 30 or 40 times a second. Even though it doesn’t seem like much of a difference, your eye will pick up the delay. You’ll notice that the blur is out of sync with the rest of the app — and it sometimes looks worse than doing no blurring at all.
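Many of these libraries drive their capture with a CADisplayLink set to skip frames. Here’s a sketch of the idea, where captureFrame is a hypothetical method that performs the snapshot-and-blur work:

// A sketch of throttled capture: fire on every 3rd frame, roughly 20fps on a 60Hz display.
CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                         selector:@selector(captureFrame)];
displayLink.frameInterval = 3; // 60 / 3 = 20 captures per second
[displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];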
Apple handles live blur in some of their apps without issue — but unfortunately they haven’t made this API public yet. The UIView snapshot APIs of iOS 7 are a huge improvement over the old ways of doing things, but they aren’t quite fast enough for live blurring.
Some developers have seized on the blurring features of the UIToolbar to do their dirty work. Yes, it works, but it’s strongly advised that you do NOT use this in your production apps. Sure, it isn’t a private API, but it is an unsupported feature, and Apple may reject your app should you use it. There are zero guarantees that it will continue to work this way under future versions of iOS 7.
Apple could modify the UIToolbar at any point and break your app in ugly ways. In the iOS 7.0.3 update, Apple modified the effect in the UIToolbar and UINavigationBar, and some developers reported that the effect stopped working altogether. Don’t fall into this trap!
A Compromise — Blurring Live Video
All right, you might have to concede that live blurring in your apps isn’t possible at the moment. So what is possible right now, given the limitations on live blurring?
Static blurring is an acceptable compromise in many situations. In the previous section, you modified the view to make it appear as if the view was actually blurring the image behind it with a bit of visual trickery. As long as the view behind it doesn’t move, a static blur usually fits the bill. You can also achieve some nice effects by fading in the blurred background.
Do some experimenting and see if you can find effects that work around the inability to perform live blurs.
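For instance, fading in the blurred background takes only a few lines. Using the _blurView from earlier in this tutorial, one possible approach:

// One possible effect: fade the static blur in rather than popping it on screen.
_blurView.alpha = 0.0f;
[UIView animateWithDuration:0.25f animations:^{
    _blurView.alpha = 1.0f; // the blur gently materializes over the content
}];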
One thing you can do is blur live video. The bottleneck again is capturing the screen, but GPUImage is very powerful and capable of blurring videos, either live video from the camera or pre-recorded video.
Blurring Video With GPUImage
The process for blurring videos with GPUImage is very similar to blurring an image. With images, you take an instance of GPUImagePicture, send it to a GPUImageiOSBlurFilter and then send that on to a GPUImageView.
In a similar fashion, you’ll take an instance of GPUImageVideoCamera or GPUImageMovie, send it to a GPUImageiOSBlurFilter and then send it to a GPUImageView. GPUImageVideoCamera is used for the live camera on the iPhone, while GPUImageMovie is used for prerecorded videos.
Instances of GPUImageVideoCamera and GPUImageMovie are already set up in the starter project. Your job is to replace the gray backgrounds for the play and record buttons with a live blurred view of the videos.
The first thing to do is convert those instances of UIView providing the gray background to instances of GPUImageView. Once that is done, you’ll need to adjust the contentsRect for each view based on the frame of the view.
This sounds like a lot of work for each view. To make things a little easier, you’ll create a subclass of GPUImageView and put your custom code in there so it can be reused.
Go to File/New/File…, and select iOS/Cocoa Touch/Objective-C class, as below:
Name the class BlurView and make it a subclass of GPUImageView, like so:
Open ViewController.m and add the following import to the top of the file:
#import "BlurView.h" |
Still working in ViewController.m, find the declarations for _recordView and _controlView right after the @implementation declaration and modify them to instantiate BlurViews instead, like so:
BlurView *_recordView; //Update this!
UIButton *_recordButton;
BOOL _recording;

BlurView *_controlView; //Update this too!
UIButton *_controlButton;
BOOL _playing;
Modify viewDidLoad as follows:
_recordView = [[BlurView alloc] initWithFrame:
    CGRectMake(self.view.frame.size.height/2 - 50, 250, 110, 60)]; //Update this!
//_recordView.backgroundColor = [UIColor grayColor]; //Delete this!
_recordButton = [UIButton buttonWithType:UIButtonTypeCustom];
_recordButton.frame = CGRectMake(5, 5, 100, 50);
[_recordButton setTitle:@"Record" forState:UIControlStateNormal];
[_recordButton setTitleColor:[UIColor redColor] forState:UIControlStateNormal];
[_recordButton setImage:[UIImage imageNamed:@"RecordDot.png"] forState:UIControlStateNormal];
[_recordButton addTarget:self action:@selector(recordVideo) forControlEvents:UIControlEventTouchUpInside];
[_recordView addSubview:_recordButton];
_recording = NO;
_recordView.hidden = YES;
[self.view addSubview:_recordView];

_controlView = [[BlurView alloc] initWithFrame:
    CGRectMake(self.view.frame.size.height/2 - 40, 230, 80, 80)]; //Update this!
//_controlView.backgroundColor = [UIColor grayColor]; //Delete this!
Now you need to create the blurred image to show in those image views. Head back to the @implementation block and add the following two declarations:
GPUImageiOSBlurFilter *_blurFilter;
GPUImageBuffer *_videoBuffer;
You already know what GPUImageiOSBlurFilter does; what’s new here is GPUImageBuffer. This takes the video output and captures one frame so you can easily blur the image. As an added bonus, this also helps improve the performance of your app!
Normally you would send the output of the video through the blur filter and then on to the view where it will be displayed. However, when you use a buffer, you send the output of the video to the buffer, which then feeds both the background view and the blur filter. Doing this smoothes the video output display.
Add the following code to the very top of viewDidLoad, just below the call to super:
_blurFilter = [[GPUImageiOSBlurFilter alloc] init];
_videoBuffer = [[GPUImageBuffer alloc] init];
[_videoBuffer setBufferSize:1];
Still working in the same file, add the highlighted statements to useLiveCamera:
-(void)useLiveCamera {
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"No camera detected"
                                                        message:@"The current device has no camera"
                                                       delegate:self
                                              cancelButtonTitle:@"Ok"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }
    _liveVideo = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                                     cameraPosition:AVCaptureDevicePositionBack];
    _liveVideo.outputImageOrientation = UIInterfaceOrientationLandscapeLeft;

    [_liveVideo addTarget:_videoBuffer];           //Update this
    [_videoBuffer addTarget:_backgroundImageView]; //Add this
    [_videoBuffer addTarget:_blurFilter];          //And this
    [_blurFilter addTarget:_recordView];           //And finally this

    [_liveVideo startCameraCapture];

    _recordView.hidden = NO;
    _controlView.hidden = YES;
}
This results in a blurred background for the recording controls.
You’ll need to do something similar for the play controls.
Add the following code to loadVideoWithURL:, just below the _recordedVideo.playAtActualSpeed = YES; statement:
[_recordedVideo addTarget:_videoBuffer];
[_videoBuffer addTarget:_backgroundImageView];
[_videoBuffer addTarget:_blurFilter];
[_blurFilter addTarget:_controlView];
Build and run; bring up the recording controls and see how things look:
The good news is that it works — mostly. The bad news is that the entire screen has been scaled down inside the button. This sounds like a similar problem to the one you had before: you’ll need to set the contentsRect appropriately for the BlurView.
Open BlurView.m and replace the boilerplate initWithFrame: with the following code:
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        CGRect deviceSize = [UIScreen mainScreen].bounds;
        self.layer.contentsRect = CGRectMake(frame.origin.x/deviceSize.size.height,
                                             frame.origin.y/deviceSize.size.width,
                                             frame.size.width/deviceSize.size.height,
                                             frame.size.height/deviceSize.size.width);
        self.fillMode = kGPUImageFillModeStretch;
    }
    return self;
}
Each argument of the contentsRect must be between 0.0f and 1.0f. Here you simply take the location of the view and divide it by the size of the screen to get the numbers you need.
Build and run your app, and take a look at your new improved controls:
Congratulations! You have successfully implemented a static blur and a live video blur into your project. You are now fully armed with the knowledge to be able to add these iOS 7 blur effects into your own apps!
Where to Go from Here?
You can download the completed project here.
This tutorial has taught you not only about using iOS 7 blur effects in your app, but also how to make use of the GPUImage framework, which, as I hope you’ve seen, is a very powerful and capable framework. Importantly, you also saw why blur, when used properly, is a key aspect of the new iOS 7 design language. Hopefully Apple will provide access to the same APIs they’re using in a future update to the SDK, but until that happens, GPUImage is a cracking substitute.
Blurring is just the very beginning of what you’re able to do with GPUImage. The process discussed in this tutorial can be applied to a wide range of different filters and effects. You can find all of them in the documentation.
Go forth and blur!
Hope you enjoyed the tutorial! If you have any questions or comments, please join the forum discussion below!