Create iOS 7 blur effect with latest APIs

The blurred overlay effect itself has been around for ages, but incorporating it in iOS 7 gave its popularity a huge boost. Creating an iOS 7 blur effect basically involves two steps: creating a snapshot of the underlying content of some context, and applying a blur to it. There are numerous approaches out there for each; this solution involves some fresh APIs.

There are numerous approaches out there to create the iOS 7 blur effect; this solution involves some fresh iOS 7 UIView APIs, along with new GPUImage filters.

Create a UIView snapshot with the new iOS 7 UIView API

I suspect the iOS 7 blur effect itself is the root cause of these new iOS 7 UIView APIs. With iOS 7, every UIView has some handy methods for capturing a view snapshot.

The method drawViewHierarchyInRect:afterScreenUpdates: does nearly the same as its CALayer predecessor renderInContext:, but this one captures the actual onscreen content (it is the only way to capture SpriteKit content, for example).

So the first step of creating the iOS 7 blur effect looks like this:

// Snapshot the scene into a UIImage.
// Note: UIGraphicsBeginImageContext creates a 1x (non-retina) context, which
// is usually fine for a blur source; use UIGraphicsBeginImageContextWithOptions
// to capture at screen scale instead.
UIGraphicsBeginImageContext(snapshotBounds.size);
[self drawViewHierarchyInRect:snapshotBounds afterScreenUpdates:YES];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

You can specify a smaller bounding rectangle for the snapshot, trading some resolution for performance. The blurred image doesn't have to be at full resolution anyway, since users can hardly perceive the difference.
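As a sketch, capturing at half resolution could look like this (snapshotBounds is an assumed local variable; drawViewHierarchyInRect: scales the hierarchy down into the given rectangle):

// Capture the view hierarchy at half resolution.
CGRect snapshotBounds = CGRectMake(0.0, 0.0,
                                   self.bounds.size.width / 2.0,
                                   self.bounds.size.height / 2.0);
UIGraphicsBeginImageContext(snapshotBounds.size);
[self drawViewHierarchyInRect:snapshotBounds afterScreenUpdates:YES];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();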

Using GPUImage blur filters to create iOS 7 blur effect

Applying a blur to this image can be done in several ways, like using CIFilter, or one of the stack blur implementations for iOS. My preference, though, is Brad Larson's GPUImage, especially since an update was released the other day that incorporates a new filter called GPUImageiOSBlurFilter.

The main reason for picking this library is performance. All processing inside is implemented in plain OpenGL ES, so it can hardly be more performant. The library is well maintained, and I must say it has way more capabilities beyond this blur effect anyway. So hook up the static library in your project and apply the effect to the snapshot created before in a couple of lines.

// Create filter.
self.blurFilter = [GPUImageiOSBlurFilter new];

// Apply filter.
UIImage *blurredSnapshotImage = [self.blurFilter imageByFilteringImage:snapshotImage];

There you go, now you can use it in your view hierarchy as you see fit. You can still experiment with the other blur implementations this framework provides, like GPUImageGaussianBlurFilter or GPUImageBoxBlurFilter.
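A minimal sketch of presenting the result, assuming you simply want a full-size blurred overlay on top of the captured view (the image view here is a hypothetical setup, not part of the filter API):

// Present the blurred snapshot in an overlay image view.
UIImageView *blurOverlayView = [[UIImageView alloc] initWithFrame:self.bounds];
blurOverlayView.image = blurredSnapshotImage;
[self addSubview:blurOverlayView];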

You may take a look at the GPUImageiOSBlurFilter parameters to fine-tune the results once you feel like experimenting. My favourite is downsampling, which scales the image down before the blur, then scales the result back up when done. This is nearly the same consideration that underlies capturing a smaller view snapshot, as suggested before.
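For reference, tuning could look something like this (property names taken from the GPUImageiOSBlurFilter header; the values shown are illustrative, so check the current release for exact APIs and defaults):

// Fine-tune the iOS blur filter before applying it.
self.blurFilter.blurRadiusInPixels = 4.0; // Radius of the Gaussian blur.
self.blurFilter.saturation = 0.8;         // iOS 7-style desaturation.
self.blurFilter.downsampling = 4.0;       // Scale down before blur, back up after.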
