


iOS 8 Metal Tutorial with Swift: Getting Started


Learn how to use Apple’s new API for GPU-accelerated 3D graphics: Metal!

In iOS 8, Apple released a new API for GPU-accelerated 3D graphics called Metal.

Metal is similar to OpenGL ES, in that it is a low-level API for interacting with 3D graphics hardware.

The difference is that Metal is not cross platform. Instead, it is designed to be extremely efficient with Apple hardware, offering much improved speed and low overhead compared to using OpenGL ES.

In this tutorial, you’ll get some hands-on experience using Metal and Swift to create a bare-bones app: drawing a simple triangle. In the process, you’ll learn about some of the most important classes in Metal, such as devices, command queues, and more.

This tutorial is designed so that anyone can go through it, regardless of your 3D graphics background – however, we will go fairly quickly. If you do have some prior 3D programming or OpenGL experience you will find things much easier, as many of the concepts you’re already familiar with apply to Metal.

This tutorial assumes you are familiar with Swift. If you are new to Swift, check out Apple’s Swift site or some of our Swift tutorials first.

Note: Metal apps do not run on the iOS simulator – they require a device with an Apple A7 chip or later. So to go through this tutorial, you will need one of these devices (an iPhone 5S, iPad Air, or iPad mini (2nd generation) at the time of writing this tutorial).

Metal vs. Sprite Kit, Scene Kit, or Unity


Before we get started, I wanted to discuss how Metal compares to higher level frameworks like Sprite Kit, Scene Kit, or Unity.

Metal is a low-level 3D graphics API, similar to OpenGL ES but lower overhead. It is a very thin layer above the GPU, so doing just about anything (such as rendering a sprite or a 3D model to the screen) requires you to write all of the code to do this. The tradeoff is you have full power and control.

Higher level game frameworks like Sprite Kit, Scene Kit, or Unity are built on top of a lower-level 3D graphics API like Metal or OpenGL ES. They provide much of the boilerplate code you normally need to write in a game, such as rendering a sprite or 3D model to the screen.


If all you’re trying to do is make a game, most of the time I’d recommend you use a higher level game framework like Sprite Kit, Scene Kit, or Unity because it will make your life much easier. If this sounds like you, we have tons of tutorials to help get you started with these frameworks.

However, there are still two really good reasons to learn Metal:

  1. Push the hardware to its limits: Since Metal is at such a low level, it allows you to really push the hardware to its limits and have full control over how your game works.
  2. It’s a great learning experience: Learning Metal teaches you a lot about 3D graphics, writing your own game engine, and how higher level game frameworks work.

If either of these sound like good reasons to you, keep reading!

Metal vs OpenGL ES


Next let’s take a look at the difference between Metal and OpenGL ES.

OpenGL ES is designed to be cross-platform. That means you can write C++ OpenGL ES code and, most of the time with only small modifications, run it on other platforms (such as Android).

Apple realized that although the cross-platform support of OpenGL ES was nice, it couldn’t take advantage of something fundamental to how Apple designs its products: the tight integration of the operating system, hardware, and software as a complete package.

So Apple took a clean-room approach to see what it would look like if they were to design a graphics API specifically for their hardware with the goal of being extremely low overhead and performant, and supporting the latest and greatest features.

The result was Metal – which can provide up to 10x the number of draw calls for your application compared to OpenGL ES. This can result in some amazing effects, like you may remember from the Zen Garden example in the WWDC 2014 keynote.

Let’s dig right in and see some Metal code!

Getting Started

Xcode’s iOS Game template comes with a Metal option, but you are not choosing that here. This is because I want to show you how to put together a Metal app from scratch, so you can understand every step of the process.

So open Xcode 6 and make a new project with the iOS\Application\Single View Application template. Enter HelloMetal for the Product Name, set the Language to Swift, and set Devices to Universal. Click Next, choose a directory, and click Create.

There are seven steps to set up Metal:

  1. Create a MTLDevice
  2. Create a CAMetalLayer
  3. Create a Vertex Buffer
  4. Create a Vertex Shader
  5. Create a Fragment Shader
  6. Create a Render Pipeline
  7. Create a Command Queue

Let’s go through them one at a time.

1) Create a MTLDevice

The first thing you need to do to use Metal is to get a reference to a MTLDevice.

You can think of a MTLDevice as your direct connection to the GPU. You will create all the other Metal objects you need (like command queues, buffers, and textures) using this MTLDevice.

To do this, open ViewController.swift and add this import to the top of the file:

import Metal

This imports the Metal framework so that you can use Metal classes (like MTLDevice) inside this file.

Next, add this property to the ViewController class:

var device: MTLDevice! = nil

You are going to initialize this property in viewDidLoad() rather than in an initializer, so it has to be an optional. Since you know you’re definitely going to initialize it before you use it, you mark it as an implicitly unwrapped optional for convenience purposes.

Finally, add this line to the end of viewDidLoad():

device = MTLCreateSystemDefaultDevice()

This function returns a reference to the default MTLDevice that your code should use.
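Note that this call may not give you a usable device everywhere: on hardware without an A7 chip (and, at the time of writing, in the iOS simulator) there is no Metal support. The tutorial project doesn’t guard against this, but a minimal defensive check (purely an illustrative sketch, not part of the original code) could look like this:

device = MTLCreateSystemDefaultDevice()
if device == nil {
  // No usable Metal device here (for example, older hardware), so bail out
  // instead of crashing later when creating Metal objects.
  println("Metal is not supported on this device")
  return
}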

2) Create a CAMetalLayer

In iOS, everything you see on screen is backed by a CALayer. There are subclasses of CALayers for different effects, like gradient layers, shape layers, replicator layers, and more.

Well, if you want to draw something on the screen with Metal, you need to use a special subclass of CALayer called CAMetalLayer. So let’s add one of these to your view controller.

First add this import to the top of the file:

import QuartzCore

You need this because CAMetalLayer is part of the QuartzCore framework, not the Metal framework.

Then add this new property to the class:

var metalLayer: CAMetalLayer! = nil

This will store a handy reference to your new layer.

Finally, add this code to the end of viewDidLoad():

metalLayer = CAMetalLayer()          // 1
metalLayer.device = device           // 2
metalLayer.pixelFormat = .BGRA8Unorm // 3
metalLayer.framebufferOnly = true    // 4
metalLayer.frame = view.layer.frame  // 5
view.layer.addSublayer(metalLayer)   // 6

Let’s go over this line-by-line:

  1. You create a new CAMetalLayer.
  2. You must specify the MTLDevice that the layer should use. You simply set this to the device you obtained earlier.
  3. You set the pixel format to BGRA8Unorm, which is a fancy way of saying “8 bits for Blue, Green, Red, and Alpha, in that order – with normalized values between 0 and 1”. This is one of only 2 possible formats to use for a CAMetalLayer, so normally you just leave this as-is.
  4. Apple encourages you to set framebufferOnly to true for performance reasons unless you need to sample from the textures generated for this layer, or if you need to enable compute kernels on the layer drawable texture (most of the time, you don’t need to do this).
  5. You set the frame of the layer to match the frame of the view.
  6. You add the layer as a sublayer of the view’s main layer.

3) Create a Vertex Buffer

In Metal, everything you draw is made up of triangles. In this app, you’re just going to draw one triangle, but even complex 3D shapes can be decomposed into a series of triangles.

In Metal, the default coordinate system is the normalized coordinate system, which means that by default you are looking at a 2x2x1 cube centered at (0, 0, 0.5).

If you consider the Z=0 plane, then (-1, -1, 0) is the lower left, (0, 0, 0) is the center, and (1, 1, 0) is the upper right. In this tutorial, you want to draw a triangle with these three points:

The three vertices are (0.0, 1.0, 0.0) at the top center, (-1.0, -1.0, 0.0) at the bottom left, and (1.0, -1.0, 0.0) at the bottom right.

Let’s create a buffer for this. Add the following constant property to your class:

let vertexData:[Float] = [
  0.0, 1.0, 0.0,
  -1.0, -1.0, 0.0,
  1.0, -1.0, 0.0]

This creates an array of floats on the CPU – you need to send this data to the GPU by moving it to something called a MTLBuffer.

Add another new property for this:

var vertexBuffer: MTLBuffer! = nil

Then add this code to the end of viewDidLoad():

let dataSize = vertexData.count * sizeofValue(vertexData[0]) // 1
vertexBuffer = device.newBufferWithBytes(vertexData, length: dataSize, options: nil) // 2

Let’s go over this line-by-line:

  1. You need to get the size of the vertex data in bytes. You do this by multiplying the size of the first element by the count of elements in the array (here, 9 Float values × 4 bytes each = 36 bytes).
  2. You call newBufferWithBytes(length:options:) on the MTLDevice you created earlier to create a new buffer on the GPU, passing in the data from the CPU. You pass nil to accept the default options.

4) Create a Vertex Shader

The vertices you created in the previous section will become the input to a little program you will write called a vertex shader.

A vertex shader is simply a tiny program that runs on the GPU, written in a C++-like language called the Metal Shading Language.

A vertex shader is called once per vertex, and its job is to take that vertex’s information (like position and possibly other information such as color or texture coordinate), and return a potentially modified position (and possibly other data).

To keep things simple, your vertex shader will return the same position that was passed in.


The easiest way to understand vertex shaders is to see one for yourself. Go to File\New\File, choose iOS\Source\Metal File, and click Next. Enter Shaders.metal for the filename and click Create.

Note: You can include multiple shaders in a single Metal file, or split your shaders across multiple Metal files if you prefer – Metal will load shaders from any Metal file included in your project.

Add the following code to the bottom of Shaders.metal:

vertex float4 basic_vertex(                           // 1
  const device packed_float3* vertex_array [[ buffer(0) ]], // 2
  unsigned int vid [[ vertex_id ]]) {                 // 3
  return float4(vertex_array[vid], 1.0);              // 4
}

Let’s go over this line-by-line:

  1. All vertex shaders must begin with the keyword vertex. The function must return (at least) the final position of the vertex – you do so here by indicating float4 (a vector of 4 floats). You then give the name of the vertex shader – you will look up the shader later using this name.
  2. The first parameter is a pointer to an array of packed_float3 (a packed vector of 3 floats) – i.e. the position of each vertex.


    The [[ ... ]] syntax is used to declare attributes which can be used to specify additional information such as resource locations, shader inputs, and built-in variables. Here you mark this parameter with [[ buffer(0) ]], to indicate that this parameter will be populated by the first buffer of data that you send to your vertex shader from your Metal code.

  3. The vertex shader will also take a special parameter with the vertex_id attribute, which means it will be filled in with the index of this particular vertex inside the vertex array.
  4. Here you look up the position inside the vertex array based on the vertex id and return that. You also convert the vector to a float4, where the final value is 1.0 (long story short, this is required for 3D math purposes).

5) Create a Fragment Shader

After the vertex shader completes, another shader is called for each fragment (think pixel) on the screen: the fragment shader.

The fragment shader gets its input values by interpolating the output values from the vertex shader. For example, consider the fragment between the bottom two vertices of the triangle:


The input value for this fragment will be the 50/50 blend of the output value of the bottom two vertices.

The job of a fragment shader is to return the final color for each fragment. To keep things simple, you will make each fragment white.

Add the following code to the bottom of Shaders.metal:

fragment half4 basic_fragment() { // 1
  return half4(1.0);              // 2
}

Let’s go over this line-by-line:

  1. All fragment shaders must begin with the keyword fragment. The function must return (at least) the final color of the fragment – you do so here by indicating half4 (a four-component RGBA color value). Note that half4 is more memory efficient than float4 because you are writing to less GPU memory.
  2. Here you return (1, 1, 1, 1) for the color, which is white. half4(1.0) sets all four components to 1.0; for example, half4(1.0, 0.0, 0.0, 1.0) would be opaque red instead.

6) Create a Render Pipeline

Now that you’ve created a vertex and fragment shader, you need to combine them (along with some other configuration data) into a special object called the render pipeline.

One of the cool things about Metal is that the shaders are precompiled, and the render pipeline configuration is compiled after you first set it up, so everything is made extremely efficient.

First add a new property to ViewController.swift:

var pipelineState: MTLRenderPipelineState! = nil

This will keep track of the compiled render pipeline you are about to create.

Next, add the following code to the end of viewDidLoad():

// 1
let defaultLibrary = device.newDefaultLibrary()
let fragmentProgram = defaultLibrary.newFunctionWithName("basic_fragment")
let vertexProgram = defaultLibrary.newFunctionWithName("basic_vertex")
 
// 2
let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
pipelineStateDescriptor.vertexFunction = vertexProgram
pipelineStateDescriptor.fragmentFunction = fragmentProgram
pipelineStateDescriptor.colorAttachments[0].pixelFormat = .BGRA8Unorm
 
// 3
var pipelineError : NSError?
pipelineState = device.newRenderPipelineStateWithDescriptor(pipelineStateDescriptor, error: &pipelineError)
if pipelineState == nil {
  println("Failed to create pipeline state, error \(pipelineError)")
}

Let’s go over this section by section:

  1. You can access any of the precompiled shaders included in your project through the MTLLibrary object that you can get by calling device.newDefaultLibrary(). Then you can look up each shader by name.
  2. You set up your render pipeline configuration here. It contains the shaders you want to use, and the pixel format for the color attachment (i.e. the output buffer you are rendering to – the CAMetalLayer itself).
  3. Finally, you compile the pipeline configuration into a pipeline state that is efficient to use from here on out.
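One thing worth knowing (this check is not in the original tutorial, just a hedged sketch): newFunctionWithName() returns nil if no function with that name exists in the library, which usually means a typo or a .metal file that isn’t part of the target. You could catch that early like so:

if vertexProgram == nil || fragmentProgram == nil {
  println("Could not find the shader functions in the default library")
}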

7) Create a Command Queue

The final one-time-setup step is to create a MTLCommandQueue.

Think of this as an ordered list of commands that you tell the GPU to execute, one at a time.

To create a command queue, simply add a new property:

var commandQueue: MTLCommandQueue! = nil

And add this line at the end of viewDidLoad():

commandQueue = device.newCommandQueue()

Congrats – your one-time set up code is done!
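At this point, assuming you added each snippet in the order shown above, your viewDidLoad() should look roughly like this:

override func viewDidLoad() {
  super.viewDidLoad()

  // 1) The device
  device = MTLCreateSystemDefaultDevice()

  // 2) The layer that displays Metal content
  metalLayer = CAMetalLayer()
  metalLayer.device = device
  metalLayer.pixelFormat = .BGRA8Unorm
  metalLayer.framebufferOnly = true
  metalLayer.frame = view.layer.frame
  view.layer.addSublayer(metalLayer)

  // 3) The vertex buffer holding the triangle
  let dataSize = vertexData.count * sizeofValue(vertexData[0])
  vertexBuffer = device.newBufferWithBytes(vertexData, length: dataSize, options: nil)

  // 4-6) The render pipeline (the shaders live in Shaders.metal)
  let defaultLibrary = device.newDefaultLibrary()
  let fragmentProgram = defaultLibrary.newFunctionWithName("basic_fragment")
  let vertexProgram = defaultLibrary.newFunctionWithName("basic_vertex")

  let pipelineStateDescriptor = MTLRenderPipelineDescriptor()
  pipelineStateDescriptor.vertexFunction = vertexProgram
  pipelineStateDescriptor.fragmentFunction = fragmentProgram
  pipelineStateDescriptor.colorAttachments[0].pixelFormat = .BGRA8Unorm

  var pipelineError : NSError?
  pipelineState = device.newRenderPipelineStateWithDescriptor(pipelineStateDescriptor, error: &pipelineError)
  if pipelineState == nil {
    println("Failed to create pipeline state, error \(pipelineError)")
  }

  // 7) The command queue
  commandQueue = device.newCommandQueue()
}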

Rendering the Triangle

Now it’s time to move on to the code that executes each frame, to render the triangle!

This is done in 5 steps:

  1. Create a Display Link
  2. Create a Render Pass Descriptor
  3. Create a Command Buffer
  4. Create a Render Command Encoder
  5. Commit your Command Buffer

Let’s dive in!

Note: In theory this app doesn’t actually need to render things once per frame, because the triangle doesn’t move after it’s drawn! However, most apps do have things moving so we’ll do things this way. It also gives a nice starting point for future tutorials.

1) Create a Display Link

You want a function to be called every time the device screen refreshes so you can re-draw the screen.

On iOS, you do this with the handy CADisplayLink class. To use this, add a new property to the class:

var timer: CADisplayLink! = nil

And initialize it at the end of viewDidLoad() as follows:

timer = CADisplayLink(target: self, selector: Selector("gameloop"))
timer.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSDefaultRunLoopMode)

This sets up your code to call a method named gameloop() every time the screen refreshes.
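One side note the tutorial doesn’t cover: a CADisplayLink retains its target and keeps firing until you call invalidate() on it. This sample app keeps the display link running for its whole lifetime, but in a more complete app you would typically stop it when the view goes away. A minimal sketch (viewWillDisappear() is just one reasonable place for this, and this code is not part of the original project):

override func viewWillDisappear(animated: Bool) {
  super.viewWillDisappear(animated)
  // Stop the display link; it must be invalidated explicitly or it keeps firing.
  timer?.invalidate()
}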

Finally, add these stub methods to the class:

func render() {
  // TODO
}
 
func gameloop() {
  autoreleasepool {
    self.render()
  }
}

Here gameloop() simply calls render() each frame, which right now just has an empty implementation. Let’s flesh this out.

2) Create a Render Pass Descriptor

The next step is to create a MTLRenderPassDescriptor, which is an object that configures what texture is being rendered to, what the clear color is, and a bit of other configuration.

Simply add these lines inside render():

var drawable = metalLayer.nextDrawable()
 
let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = drawable.texture
renderPassDescriptor.colorAttachments[0].loadAction = .Clear
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 104.0/255.0, blue: 5.0/255.0, alpha: 1.0)

First you call nextDrawable() on the metal layer you created earlier, which returns the texture you need to draw into in order for something to appear on the screen.

Next you configure the render pass descriptor to use that. You set the load action to Clear, which means clear the texture to the clear color before doing any drawing, and you set the clear color to the green color we use on the site.

3) Create a Command Buffer

The next step is to create a command buffer. Think of this as the list of render commands that you wish to execute for this frame. The cool thing is nothing actually happens until you commit the command buffer, giving you fine-grained control over when things occur.

Creating a command buffer is easy. Simply add this line to the end of render():

let commandBuffer = commandQueue.commandBuffer()

A command buffer contains one or more render commands. Let’s create one of these next.

4) Create a Render Command Encoder

To create a render command, you use a helper object called a render command encoder. To try this out, add these lines to the end of render():

let renderEncoder = commandBuffer.renderCommandEncoderWithDescriptor(renderPassDescriptor)
renderEncoder.setRenderPipelineState(pipelineState)
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, atIndex: 0)
renderEncoder.drawPrimitives(.Triangle, vertexStart: 0, vertexCount: 3, instanceCount: 1)
renderEncoder.endEncoding()

Here you create a command encoder and specify the pipeline and vertex buffer you created earlier.

The most important part is the call to drawPrimitives(vertexStart:vertexCount:instanceCount:). Here you are telling the GPU to draw a set of triangles, based on the vertex buffer. Each triangle consists of 3 vertices, starting at index 0 inside the vertex buffer, and there is 1 triangle total.

When you’re done, you simply call endEncoding().

5) Commit your Command Buffer

The final step is to commit the command buffer. Add these lines to the end of render():

commandBuffer.presentDrawable(drawable)
commandBuffer.commit()

The first line is needed to make sure the new texture is presented as soon as the drawing completes. Then you commit the transaction to send the task to the GPU.
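For reference, the completed render() method, assembled from the snippets above, looks roughly like this:

func render() {
  var drawable = metalLayer.nextDrawable()

  // Clear the drawable's texture to the green background color
  let renderPassDescriptor = MTLRenderPassDescriptor()
  renderPassDescriptor.colorAttachments[0].texture = drawable.texture
  renderPassDescriptor.colorAttachments[0].loadAction = .Clear
  renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 104.0/255.0, blue: 5.0/255.0, alpha: 1.0)

  // One command buffer per frame
  let commandBuffer = commandQueue.commandBuffer()

  // Encode a single draw call for the triangle
  let renderEncoder = commandBuffer.renderCommandEncoderWithDescriptor(renderPassDescriptor)
  renderEncoder.setRenderPipelineState(pipelineState)
  renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, atIndex: 0)
  renderEncoder.drawPrimitives(.Triangle, vertexStart: 0, vertexCount: 3, instanceCount: 1)
  renderEncoder.endEncoding()

  // Present the drawable and hand the work off to the GPU
  commandBuffer.presentDrawable(drawable)
  commandBuffer.commit()
}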

Phew! That was a ton of code, but at long last you are done! Build and run the app and bask in your triangle glory:


The most beautiful triangle I’ve ever seen!

Note: If your app crashes, make sure that you are running on an actual device (not the simulator) that has an A7 chip (an iPhone 5S, iPad Air, or iPad mini (2nd generation) at the time of writing this tutorial).

Where To Go From Here?

Here is the final example project from this iOS 8 Metal Tutorial.

Congratulations, you have learned a ton about the new Metal API! You now have an understanding of some of the most important concepts in Metal, such as shaders, devices, command buffers, pipelines, and more.

I may write some more tutorials in this series covering uniforms, moving to 3D, texturing, lighting, and importing models if there’s enough interest – add a comment below if you’d like to see more!

In the meantime, be sure to check out some great resources from Apple, such as the Metal documentation and the WWDC 2014 session videos.

I hope you enjoyed this tutorial, and if you have any comments or questions, please join the forum discussion below!


Top 10 Core Data Tools and Libraries


Check out the top 10 Core Data tools to use in your apps!

Core Data is a great choice for persisting and querying data in your iOS and OSX apps. Not only can it reduce memory usage and improve performance, but it can also save you from writing a lot of unnecessary boilerplate code.

In addition, the Core Data API is extremely flexible, which allows it to be used in a myriad of apps, all with different data storage requirements.

However, this flexibility means that sometimes Core Data can be slightly difficult to work with. Even if you’re a Core Data guru, there’s still a lot of mundane tasks required, and a lot of room to make silly errors.

Luckily, there are a lot of great tools that can help you out and make Core Data significantly easier to work with. Here are our top 10 picks that you should know and love!

Note: Even with these great tools and libraries, you’ll still need a good understanding of Core Data to reap their benefits. If you need some more experience with Core Data, check out our beginner tutorial.

Also note this article has an Objective-C focus since most Core Data libraries are written in Objective-C at the moment. If you want to learn how to use Core Data with Swift, check out our upcoming book Core Data by Tutorials, which is fully updated for iOS 8 and Swift!

10. RestKit

RestKit is an Objective-C framework for interacting with RESTful web services. It provides a Core Data entity mapping engine that maps serialized response objects directly to managed objects.

The code example below shows you how to set up RestKit to access the OpenWeatherMap API and map the JSON response of the /weather endpoint into a WFWeather managed object:

- (void)loadForecastData {
  RKManagedObjectStore *store = self.managedObjectStore;
 
  // 1
  RKEntityMapping *mapping = [RKEntityMapping mappingForEntityForName:@"WFWeather"
                                                 inManagedObjectStore:store];
  [mapping addAttributeMappingsFromArray:@[@"temp", @"pressure", @"humidity"]];
 
  // 2
  NSIndexSet *statusCodeSet = RKStatusCodeIndexSetForClass(RKStatusCodeClassSuccessful);
  RKResponseDescriptor *responseDescriptor = [RKResponseDescriptor
                                              responseDescriptorWithMapping:mapping
                                              method:RKRequestMethodGET
                                              pathPattern:@"/data/2.5/weather"
                                              keyPath:@"main"
                                              statusCodes:statusCodeSet];
 
  // 3
  NSURL *url = [NSURL URLWithString:
                [NSString stringWithFormat:@"http://api.openweathermap.org/data/2.5/weather?q=Orlando"]];
  NSURLRequest *request = [NSURLRequest requestWithURL:url];
  RKManagedObjectRequestOperation *operation = [[RKManagedObjectRequestOperation alloc]
                                                initWithRequest:request
                                                responseDescriptors:@[responseDescriptor]];
  operation.managedObjectCache = store.managedObjectCache;
  operation.managedObjectContext = store.mainQueueManagedObjectContext;
 
  // 4
  [operation setCompletionBlockWithSuccess:
   ^(RKObjectRequestOperation *operation, RKMappingResult *mappingResult){
     NSLog(@"%@",mappingResult.array);
     [self.tableView reloadData];
   } failure:^(RKObjectRequestOperation *operation, NSError *error) {
     NSLog(@"ERROR: %@", [error localizedDescription]);
   }];
 
  [operation start];
}

Here’s what’s going on in the code above:

  1. First, you create a RKEntityMapping object that tells RestKit how to map API responses to attributes of WFWeather.
  2. Here, RKResponseDescriptor ties responses from /data/2.5/weather to the RKEntityMapping instance above.
  3. RKManagedObjectRequestOperation defines which operation to execute. In this example, you request the weather in Orlando from the OpenWeatherMap API and point the response to the instance of RKResponseDescriptor noted above.
  4. Finally, you execute the operation with the requisite success and failure blocks. When RestKit sees a response coming back that matches the defined RKResponseDescriptor it will map the data directly into your instance of WFWeather.

In the code above there’s no need for manual JSON parsing, checking for [NSNull null], manual creation of Core Data entities, or any of the other routine things that one must do when connecting to an API. RestKit turns API responses into Core Data model objects via a simple mapping dictionary. It doesn’t get much easier than that.

To learn how to install and use RestKit, check out our Introduction to RestKit tutorial.

9. MMRecord

MMRecord is a block-based integration library that uses your Core Data model configuration to automatically create and populate complete object graphs from API responses. It makes generating native objects from web service requests as simple as possible, as it creates, fetches, and populates NSManagedObject instances for you in the background.

The code block below shows how to use MMRecord to perform the same Orlando weather call and data mapping that you did above in the RestKit example:

NSManagedObjectContext *context = [[MMDataManager sharedDataManager] managedObjectContext];
 
[WFWeather 
  startPagedRequestWithURN:@"data/2.5/weather?q=Orlando"
                      data:nil
                   context:context
                    domain:self
  resultBlock:^(NSArray *weather, ADNPageManager *pageManager, BOOL *requestNextPage) {
    NSLog(@"Weather: %@", weather);
  }
  failureBlock:^(NSError *error) {
    NSLog(@"%@", [error localizedDescription]);
}];

Without writing any complicated networking code or manually parsing the JSON response, you’ve called an API and populated your Core Data managed objects with the response data in only a few lines of code.

How does MMRecord know how to locate your objects in the API response? Your managed objects must be subclasses of MMRecord and override keyPathForResponseObject as shown below:

@interface WFWeather : MMRecord
@property (nonatomic) float temp;
@property (nonatomic) float pressure;
@property (nonatomic) float humidity;
@end
 
@implementation WFWeather
@dynamic temp;
@dynamic pressure;
@dynamic humidity;
 
+ (NSString *)keyPathForResponseObject {
    return @"main";
}
 
@end

keyPathForResponseObject returns a key path that specifies the location of this object relative to the root of the response object from the API. In this case the key path is main for the data/2.5/weather call.

It’s not all magic — MMRecord does require that you create a server class that knows how to make requests against the API you’re integrating with. Thankfully, MMRecord comes with sample AFNetworking-based server classes.

For complete information on setting up and using MMRecord, the readme on the MMRecord Github repository is the best place to start.

8. Magical Record

Modeled after Ruby on Rails’ ActiveRecord system, MagicalRecord provides a set of classes and categories that enable one-line entity fetch, insertion and deletion operations.

Here’s a view of MagicalRecord in operation:

// Fetching
NSArray *people = [Person MR_findAll];
 
// Creating
Person *myPerson = [Person MR_createEntity];
 
// Deleting
[myPerson MR_deleteEntity];

MagicalRecord makes it easy to set up your Core Data stack. Instead of having many lines of boilerplate code, you can set up a full Core Data stack with only one method call in your AppDelegate file as follows:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
  // 1
  [MagicalRecord setupCoreDataStackWithStoreNamed:@"ExampleDatabase.sqlite"];
 
  return YES;
}

You call setupCoreDataStackWithStoreNamed in application:didFinishLaunchingWithOptions: with the name of your SQLite file. This sets up your instances of NSPersistentStoreCoordinator, NSManagedObjectModel and NSManagedObjectContext so that you’re all ready to work with Core Data.

Have a look at our MagicalRecord tutorial for further information on how to install and use MagicalRecord.

7. GDCoreDataConcurrencyDebugging

Concurrency issues are some of the hardest things to debug in Core Data. The performBlock APIs help, but it’s still easy to make mistakes.

The open source project GDCoreDataConcurrencyDebugging can be added to your own projects to alert you via console messages when NSManagedObjects are accessed on the wrong thread or dispatch queue.

Below is an example of accessing an instance of NSManagedObject from the wrong context:

__block NSManagedObject *objectInContext1 = nil;
 
[context1 performBlockAndWait:^{
 
  objectInContext1 = [[NSManagedObject alloc] initWithEntity:entity 
                              insertIntoManagedObjectContext:context1];
  [objectInContext1 setValue:@"test" forKey:@"name"];
 
  NSError *saveError;
  if ([context1 save:&saveError] == NO) {
 
    NSLog(@"Error: %@", [saveError localizedDescription]);
  }
}];
 
 
// Invalid access
[context2 performBlockAndWait:^{
 NSString *name = [objectInContext1 valueForKey:@"name"];
}];

In the code above you’re trying to read name in context2 from an object that was originally created in context1.

If you were to run the above example using GDCoreDataConcurrencyDebugging you’d see the following console message that advises you of the problem:

2014-06-17 13:20:24.530 SampleApp[24222:60b] CoreData concurrency failure

Note: You should remove GDCoreDataConcurrencyDebugging from your app before you ship a build to the App Store as it does add a small amount of overhead that doesn’t need to be in your published app.

Core Data under iOS 8 and OS X Yosemite now has the ability to detect concurrency issues. To enable this new functionality, you pass -com.apple.CoreData.ConcurrencyDebug 1 to your app on launch via Xcode's Scheme Editor.

However, until you can phase out support for earlier OS versions in your app, GDCoreDataConcurrencyDebugging will keep you advised of concurrency issues during development.

The GDCoreDataConcurrencyDebugging README on Github is your best source of information on installing and using this tool.

6. CoreData-hs

CoreData-hs generates category methods to execute common fetch requests for all entities and properties in your Core Data Model. Creating these methods isn’t difficult, but it is time consuming — and every little bit of time saved coding is valuable!

For example, if your weather app had a view with the weather forecast and modeled each day’s forecast using a WFForecast entity with a timeStamp, temp, and summary attribute, CoreData-hs would create the following category for you:

#import <CoreData/CoreData.h>
#import <Foundation/Foundation.h>
@interface WFForecast (Fetcher)
 
+ (NSArray *)summaryIsEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsLessThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsGreaterThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsGreaterThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsLessThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsNotEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsBetwixt:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsLessThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsGreaterThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsGreaterThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsLessThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsNotEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsBetwixt:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsLessThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsGreaterThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsGreaterThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsLessThanOrEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsNotEqualTo:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)timeStampIsBetwixt:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryIsLike:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryContains:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryMatches:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryBeginsWith:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)summaryEndsWith:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempIsLike:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempContains:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempMatches:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempBeginsWith:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
+ (NSArray *)tempEndsWith:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock;
 
@end

As you can see, there are a lot of generated methods! As an example, here’s the implementation generated for tempIsGreaterThan:inContext:sortDescriptors:error::

+ (NSArray *)tempIsGreaterThan:(id)object inContext:(NSManagedObjectContext *)context sortDescriptors:(NSArray *)sort error:(void(^)(NSError *error))errorBlock {
  NSFetchRequest *fetchRequest = [NSFetchRequest fetchRequestWithEntityName:@"WFForecast"];
  [fetchRequest setPredicate:[NSPredicate predicateWithFormat:@"temp > %@", object]];
  [fetchRequest setSortDescriptors:sort];
  NSError *err = nil;
  NSArray *results = [context executeFetchRequest:fetchRequest error:&err];
  if(!results && errorBlock) {
    errorBlock(err);
    return nil;
  }
  return results;
}

Once the methods have been generated you can now use them to perform fetch requests with specific conditions. For example, if you need to fetch all WFForecast objects where the temperature is over 70° you can call tempIsGreaterThan:inContext:sortDescriptors:error: and simply pass in the target temperature as shown below:

NSSortDescriptor *sortDescriptor = [NSSortDescriptor sortDescriptorWithKey:@"temp" ascending:YES];
NSArray *results = [WFForecast tempIsGreaterThan:@(70)
                                       inContext:self.managedObjectContext
                                 sortDescriptors:@[sortDescriptor]
                                           error:^(NSError *error) {
 
  NSLog(@"Error: %@", [error localizedDescription]);
}];

You’ll get back an array of matching objects.

CoreData-hs is a lightweight utility that can save you time if you tend to write a lot of these types of requests by hand. For installation and usage instructions, consult the README on Github.

5. Core Data Editor

You can view and edit your app’s Core Data-based models from inside the GUI of Core Data Editor, which supports XML, binary and SQLite persistent store types. Beyond editing basic attributes, you can also edit and visualize data relationships. You can also use Mogenerator (discussed in item #2 below) with Core Data Editor to create your model code.

Core Data Editor is familiar with Apple’s schema and presents your data without the Z prefixes you might be familiar with if you’ve ever looked at the SQL files that Core Data generates. You can browse the contents of your app’s database in a nice table format. It also supports previewing of binary data such as pictures, and in-line editing of dates using a standard date picker:

Core Data Editor

If you need to create a seed file or just want to import data, Core Data Editor can take in a CSV file and turn it into persisted objects in Core Data as shown below:

Core Data Editor

To install Core Data Editor, download the free trial from the Thermal Core website. Uncompress the downloaded ZIP archive and move the Core Data Editor.app file to your Applications directory. The author of the app has also recently open sourced it if you want to find out how it works and make your own enhancements.

When you launch the app for the first time it will guide you through a short setup process. This process is optional but it will speed things up later if you specify, at a minimum, your iPhone Simulator directory and your Xcode derived data directory.

Note: Because you’re required to select your derived data and simulator directories in the GUI, you may run into trouble with default settings in OS X Lion and up that hide your Library folder.

In OS X Mavericks, you can correct this by going to your home directory in the Finder, selecting View / Show View Options and checking Show Library Folder. In OS X Lion and Mountain Lion, the same thing can be accomplished by typing chflags nohidden ~/Library/ into Terminal.

More details about Core Data Editor can be found on Thermal Core’s website.

4. SQLite3

Sometimes performing SQL queries directly on the underlying Core Data SQLite database can be helpful when debugging a knotty data issue. SQLite3 is a Terminal-based front-end to the SQLite library that comes installed on all Macs and should be familiar to those with extended database experience. If you don’t have extended database experience, this probably isn’t for you.

To use SQLite3, first open Terminal and navigate to your app’s Documents directory. Depending on your install, the Documents directory will be similar to ~/Library/Application Support/iPhone Simulator/7.1-64/Applications/{your app's ID}/Documents.

Change 7.1-64 in the above path to match the version of the simulator you’re using. {your app’s ID} is automatically generated by Xcode and uniquely identifies each app installation. There’s no easy way to find out which ID is yours. You can either add logging to your app when you create the Core Data stack, or look for the directory that was modified most recently – this will be the app you’re currently working on :]

The Documents directory will contain a file with the extension sqlite, which is your app’s database file. For apps using Apple’s Core Data template, the filename will match your app’s name. Open this file using the SQLite3 program as follows (the example app here is called AddressBook; your filename will be different):

$ sqlite3 AddressBook.sqlite

You’ll see the following prompt appear in the console:

SQLite version 3.7.13 2012-07-17 17:46:21
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite>

Now you’re ready to perform standard SQL queries against the database.

For example, to view the schema Core Data is using, execute the following command:

sqlite> select * from sqlite_master;

SQLite responds to your query with a textual listing of the tables in the schema as follows:

table|ZMDMPERSON|ZMDMPERSON|3|CREATE TABLE ZMDMPERSON ( Z_PK INTEGER PRIMARY KEY, Z_ENT INTEGER, Z_OPT INTEGER, ZISNEW INTEGER, ZFIRSTNAME VARCHAR )
table|Z_PRIMARYKEY|Z_PRIMARYKEY|4|CREATE TABLE Z_PRIMARYKEY (Z_ENT INTEGER PRIMARY KEY, Z_NAME VARCHAR, Z_SUPER INTEGER, Z_MAX INTEGER)
table|Z_METADATA|Z_METADATA|5|CREATE TABLE Z_METADATA (Z_VERSION INTEGER PRIMARY KEY, Z_UUID VARCHAR(255), Z_PLIST BLOB)
sqlite>

The Z prefixes on all of the table columns are part of Core Data’s underlying use of SQLite. For analysis purposes, it’s safe to ignore them.

Note: You should never write to the SQLite Core Data database directly. Apple can modify the underlying structure at any time.

If you truly have a need to directly manipulate the SQLite database in a production application, you should forgo Core Data and use raw SQL access instead. There are several popular frameworks to help you manage SQL implementation in your apps, including FMDB and FCModel.

If you’re just analyzing your data, there’s nothing wrong with poking around the SQLite database file — just don’t modify its contents.

One example of using direct SQL to analyze your data is grouping and counting distinct attributes to see the diversity of your attributes.

For example, if you have a sample address book app and want to know how many of your contacts live in each city, you could execute the following command at the SQLite3 prompt:

SELECT t0.ZCITY, COUNT( t0.ZCITY ) FROM ZMDMPERSON t0 GROUP BY t0.ZCITY

SQLite would respond with the count of each distinct city in your address book database, as shown in the example below:

San Diego|23
Orlando|34
Houston|21

To exit the SQLite3 terminal program, simply execute the following command:

sqlite> .exit

For more information on SQLite3, view its man page by opening Terminal and executing the command man sqlite3.

3. MDMCoreData

MDMCoreData (disclaimer – this library is written by me!) is a collection of open source classes that make working with Core Data easier. It doesn’t try to hide or abstract Core Data, but instead enforces best practices and reduces the amount of boilerplate code required. It’s a better alternative to the Xcode Core Data template.

MDMCoreData consists of the following four classes:

  • MDMPersistenceController – A handy controller that sets up an efficient Core Data stack with support for creating multiple child managed object contexts. It has a built-in private managed object context that saves asynchronously to a SQLite store.
  • MDMFetchedResultsTableDataSource – Implements the fetched results controller delegate and a table data source.
  • MDMFetchedResultsCollectionDataSource – Implements the fetched results controller delegate and a collection data source.
  • NSManagedObject+MDMCoreDataAdditions – A category on managed objects providing helper methods for eliminating boilerplate code such as entity names.

One great feature of MDMCoreData is that it comes with a Core Data backed table data source — so you don’t have to worry about implementing one yourself.

Instead of implementing all the required methods of the UITableViewDataSource and NSFetchedResultsControllerDelegate protocols, you can just set your table’s data source to an instance of MDMFetchedResultsTableDataSource. When instantiating the MDMFetchedResultsTableDataSource object you simply pass in the table view and a fetched results controller:

- (void)viewDidLoad {
  [super viewDidLoad];
 
  self.tableDataSource = [[MDMFetchedResultsTableDataSource alloc] initWithTableView:self.tableView
                                                        fetchedResultsController:[self fetchedResultsController]];
  self.tableDataSource.delegate = self;
  self.tableDataSource.reuseIdentifier = @"WeatherForecastCell";
  self.tableView.dataSource = self.tableDataSource;
}

A MDMFetchedResultsTableDataSource does have a delegate, with two methods that must be implemented. One method configures the cell for your table:

- (void)dataSource:(MDMFetchedResultsTableDataSource *)dataSource
     configureCell:(id)cell
        withObject:(id)object {
 
  OWMForecast *forecast = object;
 
  UITableViewCell *tableCell = (UITableViewCell *)cell;
  tableCell.textLabel.text = forecast.summary;
  tableCell.detailTextLabel.text = forecast.date;
}

The second method handles deletions:

- (void)dataSource:(MDMFetchedResultsTableDataSource *)dataSource
      deleteObject:(id)object
       atIndexPath:(NSIndexPath *)indexPath {
 
  [self.persistenceController.managedObjectContext deleteObject:object];
}

It’s far easier to implement the two required methods of MDMFetchedResultsTableDataSource than to implement all of the methods required by the table data source and fetch results controller protocols.

You can find out more about MDMCoreData at the MDMCoreData Github repository.

2. Mogenerator

Since Core Data comes with full support for key-value coding (KVC) and key-value observing (KVO), there is no requirement to implement custom NSManagedObject classes. You can get by using setValue:forKey: and valueForKey: when reading or writing attributes on your entities. But this tends to be cumbersome and hard to debug since strings can’t be checked at compile time for correctness.

For example, if you had a person Core Data entity, you could read and write attributes like this:

NSString *personName = [person valueForKey:@"firstName"];
[person setValue:@"Ned" forKey:@"firstName"];

The person object above is an instance of NSManagedObject with an attribute named firstName. To read firstName, you use valueForKey: with the key firstName. Similarly, to set the first name of a person object you can use setValue:forKey:.

A better approach is to use standard accessor methods or dot syntax; however, to do this you must implement a custom subclass of NSManagedObject for your entities. This lets you add model logic such as fetch requests and validation.

You have probably used Xcode’s Create NSManagedObject Subclass functionality to quickly create a subclass for a single entity. Although it’s a nice shortcut, it can create extra overhead if you have a large model and can cause you grief when your model changes.

Re-creating the subclass means wiping out all of your custom model logic — which means you should host that logic outside of your custom model. This lends itself to a common pattern of creating custom subclasses with managed object properties along with categories for custom model logic.

The command line tool Mogenerator automates these exact tasks for you. It generates two classes per Core Data entity. The first class is for machine consumption and is continuously overwritten as the model changes. The second class is for all your custom logic and is never overwritten.

Mogenerator has a list of other benefits which include the following:

  • No need to use NSNumber objects when reading or writing numeric attributes.
  • Helper methods for working with sets.
  • Helper methods for creating new entities.
  • A method for entity identification.

Mogenerator can be installed from the DMG available on the Mogenerator website, or alternatively through Homebrew. To install Mogenerator using Homebrew, open Terminal and run the following command:

brew install mogenerator

Once installed, use the cd command to change to your app’s directory, then run Mogenerator from Terminal like so:

$ mogenerator -m MySampleApp/ExampleModel.xcdatamodeld -O MySampleApp/Model --template-var arc=true

In the command above, you call Mogenerator followed by the location of your model with the -m option. You can also specify where the generated classes should be located with the -O option. When working with ARC you should also pass the --template-var arc=true option.

You can make Xcode run Mogenerator for you by creating a Run Script Build Phase. Build Phases are descriptions of tasks that need to be performed by Xcode during a build.

To add a Build Phase, first select the target, select the Build Phases tab, then select Editor / Add Build Phase / Add Run Script Build Phase from the menu.

Add the following code in the Shell script text area under the new Run Script, making sure to modify the parameters to mogenerator as suits your project:

if [ "${CONFIGURATION}" == "Debug" ]; then
echo "Running Mogenerator"
mogenerator -m MySampleApp/ExampleModel.xcdatamodeld -O MySampleApp/Model --template-var arc=true
echo "Finished Mogenerator"
else
echo "Skipping Mogenerator"
fi

The above run script will cause Xcode to run Mogenerator every time you run a debug build command. If there are no changes to the model, Mogenerator will do nothing and exit.

Now that you have incorporated Mogenerator into your workflow for quick subclass generation, you should take advantage of its other features.

For example, instead of unwrapping primitive values every time, you can just add the suffix Value to the accessor, as illustrated by the following code snippet:

// Without Mogenerator
if ([person.isFriend boolValue]) {
  // Do some work
}
 
// With Mogenerator
if (person.isFriendValue) {
  // Do some work
}

Since BOOL attributes are stored as NSNumber objects in Core Data, you must call boolValue on the isFriend attribute before checking whether the value is true. With Mogenerator, that extra step is no longer required, as you can simply call isFriendValue.

If Mogenerator looks like a useful addition to your toolbox, you can find more information on Mogenerator at its Github repository.

1. Instruments

Instruments is the tool of choice for investigating almost all performance and memory issues on OS X and iOS — including Core Data issues. The other tools in this list offer a lot of automation and convenience, but Instruments will typically be your first stop when investigating any issues or doing any performance tuning.

The Time Profiler and Core Data templates, shown below, are the most useful for Core Data profiling:


The default Core Data template, with the optional Faults Instrument feature added in, provides the following features to help you tune and monitor your app’s performance:

  • Core Data Fetches Instrument — Captures fetch count and duration of fetch operations.
  • Core Data Cache Misses Instrument — Captures information about fault events that result in cache misses.
  • Core Data Saves Instrument — Captures information on managed object context save events.
  • Core Data Faults Instrument — Captures information on fault events that occur during lazy initialization of NSManagedObjects or relationships.

Here is a typical instruments profile from a Core Data app. You can see when fetch requests are occurring and how long they take, when and how often save operations happen and whenever faults are being fired:


For more information about Instruments, check out our tutorial on How to Use Instruments in Xcode.

Where To Go From Here?

Core Data is a powerful framework, but it comes with a lot of development overhead. However, the tools and libraries in this article give you some methods to help you efficiently and effectively tackle that overhead.

Again, if you want to learn about how to use Core Data with Swift, check out our upcoming book Core Data by Tutorials, which is fully updated for iOS 8 and Swift.

In the meantime, if you have any other Core Data tools or libraries you really would like to recommend to others, please join the forum discussion below!


Xcode6-Beta5 Swift Tutorial Updates


Tutorials updated for Xcode6-beta5!

Good news – we’ve updated all of our Swift tutorials and other posts to Xcode6-beta5!

Here’s what we updated:

The rest of the Swift tutorials have been confirmed to work on Xcode6-beta5 with no changes required.

What Changed?

For the most part, updating everything was pretty straightforward; the changes were mostly related to the following (the first two are illustrated in the sketch after the list):

  • No longer being able to use optionals in boolean statements (i.e. must compare to nil explicitly)
  • No longer being able to use += to append an item to an array
  • Sometimes having to put override in front of initializers, or required initializers
  • Casting to NSString rather than using bridgeToObjectiveC()
  • Some type/protocol name changes
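To make the first two of these concrete, here is a small before-and-after sketch (it isn’t taken from any of the updated tutorials, it’s just an illustration):

var maybeName: String? = "Ray"

// Before beta 5 you could test the optional directly: if maybeName { ... }
// In beta 5 you compare against nil explicitly:
if maybeName != nil {
  println("We have a name")
}

var numbers = [1, 2, 3]

// Before beta 5 you could append a single element with +=: numbers += 4
// In beta 5 you use append (or += with another array):
numbers.append(4)
numbers += [5, 6]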

For a nice roundup of Xcode6-beta5 changes, check out the release notes and this handy post.

Where To Go From Here?

We’ll do our best to continue to keep things up-to-date until things finally settle down with the GM release.

It’s definitely been a challenge dealing with all these changes – especially with our upcoming books. :]

We’re just doing our best to keep everything up-to-date and functional for you guys; we hope it helps!

If you notice any issues we missed, please let us know so we can keep everything functional. Thanks!


Swift Table View Animations Tutorial: Drop-In Cards


Add some flair to your tables.

Update note: This tutorial was updated for iOS 8 and Swift by Ray Fix. Original post by Tutorial Team member Brian Broom and Code Team member Orta Therox.

The standard UITableView is a powerful and flexible way to present data in your apps; chances are that most apps you write will use table views in some form. However, one downside is that without some level of customization, your apps will look bland and blend in with the thousands of apps just like them.

To prevent boring table views, you can add some subtle animation to liven up the actions of your app. You may have seen this in the Google+ app where the cards fly in through the side with a cool animation. If you haven’t seen it yet, download it here (it’s free)! You might also want to check out the design guidelines that Google released at the 2014 I/O conference. It contains many tips and examples on how to use animation well.

In this table view animations tutorial, you’ll be using Swift to enhance an existing app to rotate the cells of a table as you scroll. Along the way, you’ll learn about how transforms are achieved in UIKit, and how to use them in a subtle manner so as not to overwhelm your user with too much happening on-screen at once. You will also get some advice on how to organize your code to keep responsibilities clear and your view controllers slim.

Before beginning, you should know how to work with UITableView and the basics of Swift. If you need an introduction to these topics, you might want to start with the Swift Tutorial series that will teach you the basics of Swift in a table view app.

At publication time, our understanding is we cannot post screenshots of iOS 8 while it’s in beta. Any screenshots shown are from iOS 7, which will be close to what you see in iOS 8.

Getting Started

Download the starter project and open it up in Xcode 6. You’ll find a simple storyboard project with a UITableViewController subclass (MainViewController) and a custom UITableViewCell (CardTableViewCell) for displaying team members. You will also find a model class called Member that encapsulates all of the information about a team member and knows how to fetch that information from a JSON file stored in the local bundle.

Build and run the project in the simulator; you’ll see the following:

Starter project

A perfectly good design, ready to be spiced up.

The app is off to a good start, but it could use a little more flair. That’s your job; you’ll use some Core Animation tricks to animate your cell.

Define the Simplest Possible Animation

To get the basic structure of the app going you’ll start by creating a super simple fade-in animation helper class. Go to File\New\File… and select type iOS\Source\Swift File for an empty Swift file. Click Next, name the file TipInCellAnimator.swift and then click Create.

Replace the file contents with the following:

import UIKit
 
class TipInCellAnimator {
  // placeholder for things to come -- only fades in for now
  class func animate(cell:UITableViewCell) {
    if let view = cell.contentView {
      view.layer.opacity = 0.1
      UIView.animateWithDuration(1.4) {
        view.layer.opacity = 1
      }
    }
  }
}

This simple class provides a method that takes a cell, gets its contentView and sets the layer’s initial opacity to 0.1. Then, over the space of 1.4 seconds, the code in the closure expression animates the layer’s opacity back to 1.0.

Note: A Swift closure is just a block of code that can also capture external variables. For example, { } is a simple closure. Functions declared with the func keyword are just examples of named closures. It is even perfectly legal to declare functions inside of other functions in Swift.

If you pass a closure as the last argument of a function, you can use the special trailing closure syntax and move the closure outside of the function call. You can see this in the UIView.animateWithDuration() call.

You can read more about closures in the Swift Programming Language book or read about the rich history of closures on Wikipedia.
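
As a minimal sketch, here's the same call written both with and without trailing closure syntax (this assumes the view constant from the code above is in scope):

// Closure passed inside the parentheses, as a labeled argument:
UIView.animateWithDuration(1.4, animations: {
  view.layer.opacity = 1
})
 
// Equivalent trailing closure syntax, as used in the code above:
UIView.animateWithDuration(1.4) {
  view.layer.opacity = 1
}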

Now that you have the animation code ready, you need the table view controller to trigger this new animation as the cells appear.

Trigger the Animation

To trigger your animation, open MainViewController.swift and add the following method to the class:

override func tableView(tableView: UITableView!, willDisplayCell cell: UITableViewCell!,
    forRowAtIndexPath indexPath: NSIndexPath!) {
  TipInCellAnimator.animate(cell)
}

This method is declared in the UITableViewDelegate protocol and gets called just before the cell is shown on the screen. It calls the TipInCellAnimator‘s class method animate() on each cell as it appears to trigger the animation.

Build and run your app. Scroll through the cards and watch the cells slowly fade in:

SwiftDrop-InFadeAnimation

Getting Fancy with Rotation

Now it’s time to make the app a little more fancy with some animated rotation. This works in the same way as the fade-in animation, except you are specifying both start and end transformations.

Open TipInCellAnimator.swift, and replace its contents with:

import UIKit
import QuartzCore // 1
 
class TipInCellAnimator {
  class func animate(cell:UITableViewCell) {
    if let view = cell.contentView {
      let rotationDegrees: CGFloat = -15.0
      let rotationRadians: CGFloat = rotationDegrees * (CGFloat(M_PI)/180.0)
      let offset = CGPointMake(-20, -20)
      var startTransform = CATransform3DIdentity // 2
      startTransform = CATransform3DRotate(CATransform3DIdentity,
        rotationRadians, 0.0, 0.0, 1.0) // 3
      startTransform = CATransform3DTranslate(startTransform, offset.x, offset.y, 0.0) // 4
 
      // 5
      view.layer.transform = startTransform
      view.layer.opacity = 0.8
 
      // 6
      UIView.animateWithDuration(0.4) {
        view.layer.transform = CATransform3DIdentity
        view.layer.opacity = 1
      }
    }
  }
}

This time the animation is quicker (0.4 seconds), the fading in is more subtle, and you get a nice rotation effect. The key to the above animation is defining the startTransform matrix and animating the cell back to its natural identity transformation. Let’s dig into that and see how it was done:

  1. This class now requires QuartzCore to be imported because it uses core animation transforms.
  2. Start with an identity transform, which is a fancy math term for “do nothing.” This is a view’s default transform.
  3. Call CATransform3DRotate to apply a rotation of -15 degrees (converted to radians), where the negative value indicates a counter-clockwise rotation. This rotation is around the axis 0.0, 0.0, 1.0; this represents the z-axis, where x=0, y=0, and z=1.
  4. Applying just the rotation to the card isn’t enough, as this simply rotates the card about its center. To make it look like it’s tipped over on a corner, add a translation or shift where the negative values indicate a shift up and to the left.
  5. Set this rotated and translated transform as the view’s initial transform.
  6. Animate the view back to its original values.

Notice how you were able to build up the final transformation one step at a time as shown in the image below:

Building up transformations

Building up transformations to produce the desired effect.

Note: An arbitrary chain of transformations can ultimately be represented by one matrix. If you studied matrix math in school, you may recognize this as multiplication of matrices. Each step multiplies a new transformation until you end up with the final matrix.

You’ll also notice that you’re transforming a child view of the cell, and not the cell itself. Rotating the actual cell would cause part of it to cover the cell above and below it, which would cause some odd visual effects, such as flickering and clipping of the cell. A cell’s contentView contains all of its constituent parts.

Not all properties support animation; the Core Animation Programming Guide provides a list of animatable properties for your reference.

Build and run your application. Watch how the cells tilt into view as they appear!

Rotation of card cell.

A Swift Refactor

The original Objective-C version of this tutorial made sure to compute the starting transform once. In the above version of the code it is computed each time animate() gets called. How might you do this in Swift?

One way is to use an immutable stored property that is computed by calling a closure. Replace the contents of TipInCellAnimator.swift with:

import UIKit
import QuartzCore
 
let TipInCellAnimatorStartTransform:CATransform3D = {
  let rotationDegrees: CGFloat = -15.0
  let rotationRadians: CGFloat = rotationDegrees * (CGFloat(M_PI)/180.0)
  let offset = CGPointMake(-20, -20)
  var startTransform = CATransform3DIdentity
  startTransform = CATransform3DRotate(CATransform3DIdentity,
    rotationRadians, 0.0, 0.0, 1.0)
  startTransform = CATransform3DTranslate(startTransform, offset.x, offset.y, 0.0)
 
  return startTransform
}()
 
class TipInCellAnimator {
  class func animate(cell:UITableViewCell) {
    if let view = cell.contentView {
 
      view.layer.transform = TipInCellAnimatorStartTransform
      view.layer.opacity = 0.8
 
      UIView.animateWithDuration(0.4) {
        view.layer.transform = CATransform3DIdentity
        view.layer.opacity = 1
      }
    }
  }
}

Notice that the code that generates the startTransform is now in its own stored property, TipInCellAnimatorStartTransform. Rather than defining this property with a getter that creates the transform each time it's called, you set its default value by assigning it a closure and following the assignment with an empty pair of parentheses. The parentheses force the closure to be called immediately and assign its return value to the property. This initialization idiom is discussed in Apple's Swift book in the chapter on initialization. See “Setting a Default Property Value with a Closure or Function” for more information.
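
Here's a minimal, generic sketch of the same idiom using a made-up constant, just to show the mechanics:

let defaultGreeting: String = {
  // This code runs exactly once, when the constant is initialized.
  let name = "Swift"
  return "Hello, \(name)!"
}()  // the trailing () calls the closure immediately and assigns its result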

Note: It would have been nice to make TipInCellAnimatorStartTransform a class property of TipInCellAnimator but as of this writing, class properties are not yet implemented in Swift.

Adding Some Limits to Your Transformation

Although the animation effect is neat, you’ll want to use it sparingly. If you’ve ever suffered through a presentation that overused sound effects or animation effects, then you know what effect overload feels like!

In your project, you only want the animation to run the first time the cell appears — as it scrolls in from the bottom. When you scroll back toward the top, the cells should scroll without animating.

You need a way to keep track of which cards have already been displayed so they won’t be animated again. To do this, you’ll use a Swift Dictionary collection that provides fast key lookup.

Note: A set is an unordered collection of unique entries with no duplicates, while an array is an ordered collection that does allow duplicates. The Swift Standard Library does not currently include a Set type, but it can easily be simulated with a Dictionary of Bools. The abstraction penalty for using a Dictionary in this way is very small, which is probably why the Swift team left it out of the initial release.

The general disadvantage of a set, or of dictionary keys, is that they don't guarantee an order; but the ordering of your cells is already handled by the data source, so that isn't an issue in this case.

Open MainViewController.swift and add the following property to the class:

var didAnimateCell:[NSIndexPath: Bool] = [:]

This declares an empty dictionary that takes NSIndexPaths as keys and Bools as values. Next, replace the implementation of tableView(tableView:, willDisplayCell:, forRowAtIndexPath:) with the following:

override func tableView(tableView: UITableView!, willDisplayCell cell: UITableViewCell!, 
                        forRowAtIndexPath indexPath: NSIndexPath!) {        
    if didAnimateCell[indexPath] == nil || didAnimateCell[indexPath]! == false {
        didAnimateCell[indexPath] = true
        TipInCellAnimator.animate(cell)
    }
}

Instead of animating every cell each time it appears as you scroll up and down the table, you check whether the cell's index path is already in the dictionary. If it isn't, this is the first time the cell has been displayed, so you run the animation and add the indexPath to the dictionary. If it's already there, you don't need to do anything at all.

Build and run your project; scroll up and down the tableview and you’ll only see the cards animate the first time they appear on-screen.

Drop-In-UpDownScroll

Where To Go From Here?

In this tutorial you added animation to a standard view controller. The implementation details of the animation were kept out of the MainViewController class and instead put into a small, focused, animation helper class. Keeping class responsibilities focused, particularly for view controllers, is one of the main challenges of iOS development.

You can download the final project for this tutorial here.

Now that you’ve covered the basics of adding animation to cells, try changing the values of your transform to see what other effects you can achieve. Some suggestions are:

  1. Faster or slower animation
  2. Larger rotation angle
  3. Different offsets; if you change the rotation angle, you will likely need to change the offset to make the animation look right. What does the animation look like if you drop the offset entirely and use (0, 0) for the parameters?
  4. Go nuts and create some whacked-out transforms.
  5. Advanced: Can you get the card to rotate along the horizontal or vertical axis? Can you make it look like it flips over completely?
  6. Advanced: Add an else clause to tableView(tableView:, willDisplayCell:, forRowAtIndexPath:) and perform a different animation when cells are displayed a second time.
Crazy Rotations

Crazy rotations that you can (but maybe shouldn’t) apply to your cells. See if you can duplicate these!

A great exercise is to try and identify animations in your favorite apps. Even with the simple animation from this tutorial, there are countless variations you can produce on this basic theme. Animations can be a great addition to user actions, such as flipping a cell around when selected, or fading in or out when presenting or deleting a cell.

If you have any questions about this table view animations tutorial or this technique in general, please join the forum discussion below!


The post Swift Table View Animations Tutorial: Drop-In Cards appeared first on Ray Wenderlich.

Swift Ninja Programming Challenge: Winners Announced!

Find out who is the ultimate Swift Ninja!

We recently had a 2-part “Swift Ninja” programming challenge:

  • The first part of this series contains programming challenges involving default values in functions, variadic parameters, map/reduce, advanced switch statement features, and more.
  • The second part of the series contains programming challenges involving recursion, operator overloading, lazy evaluation, currying, and more.

Be sure to try the challenges if you haven’t already. They’re a great way to practice your new Swift skills, and you’ll learn a ton!

At the end of the second part, we had a special final challenge that was open to competition among readers for fame and fortune.

Well, we’ve had a ton of great entries, and today we’re going to announce the winner!

The Final Challenge

Let’s start by reviewing the final challenge:

Given the structures:

enum Suit {
    case Clubs, Diamonds, Hearts, Spades
}
 
enum Rank {
    case Jack, Queen, King, Ace
    case Num(Int)
}
 
struct Card {
    let suit: Suit
    let rank: Rank
}

Write a function called countHand that takes in an array of Card instances and counts the total value of the cards given. The requirements for your solution are as follows:

  • The function returns the value of the cards in the hand as an Int.
  • Does not use loops or nested functions.
  • Any Ace preceded by the 5 of Diamonds is worth 100 points.
  • Any odd numeric card (3, 5, 7, 9) of any suit is worth double its rank value in points when immediately preceded in the hand by any Hearts card.

The Participants

A number of brave ninjas submitted their solutions to the final challenge. Feast your eyes upon their code and praise their skills:

I recommend you check out all solutions – they are way more diverse than I initially expected. Nice job everyone and thank you very much for participating!

I considered a number of aspects while choosing the winner so let me share some of my observations while I was evaluating the code.

Correctness

All provided solutions produced correct results – congratulations everyone! I know the solution requirements were a bit strange, but that was only because I wanted to challenge you to find a creative way to keep track of the last card's value.

Some of the participants figured out that the first card in the hand never contributes any value to the result. Even though this doesn't produce a "more correct" result, it's a nice optimization catch – kudos to Dojomaroc, Epinaud, and Tomek.

Brevity

Most of the participants embraced the code style of the article and produced tight and clean solutions. I personally like the solutions by Dojomaroc and Pasil – both of them kept the function code to a single return statement where a single switch plays the main role. Nice job!

Marzapower provided the longest but most flexible solution (check it out – there's some well-laid-out business logic in there). He bravely went against my "brevity" requirement and receives an honorable mention for that!

Use of Swift features

Even though use of advanced Swift doesn’t always result in the fastest or shortest solution, it’s always interesting to see what other programmers create with this amazing language.

Terkans used the most exotic structure from the Swift standard library (one I must admit I hadn't heard of before I saw his code): Zip2.

Zip2's init takes two SequenceType values and combines them into a single sequence whose values are bound together as tuples. Cool! Check out his solution for an example of how to use Zip2. Terkans also passed + as the closure parameter to Array.reduce(), which I liked very much.
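
As a minimal sketch (not taken from any of the entries), here's what both of those ideas look like:

let ranks = [1, 2, 3]
let bonuses = [10, 20, 30]
 
// Zip2 pairs up the two sequences element by element as tuples:
for (rank, bonus) in Zip2(ranks, bonuses) {
  println("rank \(rank) gets bonus \(bonus)")
}
 
// Passing the + operator function directly to reduce sums the array:
let total = bonuses.reduce(0, combine: +)  // 60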

Readability

Dojomaroc‘s solution was clearly the most readable just because he included a ton of comments. This is the kind of code I personally would love to see if I jump in on a project mid-way. Thumbs up!

Performance

Dojomaroc gave me an interesting idea with his comment, namely to measure the performance of each solution. Interestingly, the same solution doesn't come out on top on both a debug build and a release build. I tested all the solutions with the same card hand, running each solution 100,000 times.

Dojomaroc and Tomek didn't use optionals, so their solutions came out first in the debug build measurements (Dojomaroc's solution sometimes coming in a few milliseconds faster). However, the race on the release build came out quite different (lower is better):

results_speed

The fastest solution in the test clearly is Pasil‘s – thumbs up!

The Runner Up

It was extremely hard choosing between these solutions; I was very impressed with what everyone came up with.

In the end, it came down to two – and since it was so hard to decide, I wanted to give the runner up a prize as well.

The runner up is Pasil – congratulations! Ray will be in touch soon to deliver your prize – a free PDF of your choice from this site.

Here’s Pasil’s solution:

func countHand(cards: [Card]) -> Int {
  return cards.reduce((nil, 0)) { (prev: (card: Card?, sum: Int), card: Card) in
    if let prevCard = prev.card {
      switch (prevCard.suit, prevCard.rank, card.rank) {
      case (.Diamonds, .Num(5), .Ace):
        return (card, prev.sum + 100)
      case (.Hearts, _, .Num(let value)) where value % 2 == 1:
        return (card, prev.sum + 2 * value)
      default:
        return (card, prev.sum)
      }
    }
    return (card, 0)
  }.sum
}

And the winner is…

Dojomaroc is the ultimate Swift ninja!

At long last, it’s time to announce the ultimate Swift Ninja: Dojomaroc!

It was a really close call but in the end Dojomaroc’s solution was just a tiny bit better overall.

The fact that he ignored the first card in the hand – and thanks to that, didn't have to use any optionals – made his code clearer to read and fast on both debug and release builds.

Congratulations Dojomaroc! Ray will be in touch soon to deliver your prize – a free copy of our three brand new Swift books coming out later this year.

Here’s Dojomaroc’s solution:

func countHand(cards: [Card]) -> Int {
 
  // If the number of cards is inferior to 2, the value of the hand is automatically 0
  // Note that the dropFirst() function doesn't mutate the cards array, 
  // so cards[0] can be passed as the initial value of "reduce()".
  // Only cards that have a preceding card need to be evaluated by "reduce()", 
  // that's why reduce() is applied to dropFirst(cards).
  // "previous" is a tuple that keeps track of (the total value of the hand, and the last card evaluated by "reduce()")
 
  return cards.count < 2 ? 0 : dropFirst(cards).reduce((0, cards[0])) { (previous: (totalValue: Int, card: Card), currentCard: Card) in
 
    // For the previous card, we are interested in its rank and suit. 
    // As for the current card we are only concerned with its rank.
 
    switch (previous.card.rank, previous.card.suit, currentCard.rank) {
 
      // Statistically speaking: to have an odd numeric card (3, 5, 7, 9) 
      // preceded by any rank of Hearts is more probable than
      // having an Ace preceded by 5 of Diamonds . 
      // That explains the value associated (100), and more importantly 
      // the order in which the first two cases are evaluated. 
 
      case ( _, .Hearts, .Num(let rank)) where rank % 2 == 1:
        return (previous.totalValue + 2 * rank, currentCard)
 
      case (.Num(5), .Diamonds, .Ace):
        return (previous.totalValue + 100, currentCard)
 
      default:
        return (previous.totalValue, currentCard)
 
    }}.totalValue // Finally, return .totalValue of the tuple "previous"
}

Where to go from here?

I hope you enjoyed the Swift Ninja programming challenge and that you learned more about Swift along the way. Don't stop exploring Swift – it is shaping up to be a really fantastic language!

If you want to get really deep insight into the Swift development process and get notified on cool new features you can:

If you have ideas for further Ninja challenges or just want to share your thoughts on the challenge and solutions provided, please leave a comment below!


The post Swift Ninja Programming Challenge: Winners Announced! appeared first on Ray Wenderlich.

How to Port Your Sprite Kit Game from iOS to OS X

Learn how to port your Sprite Kit games from iOS to OS X!

Have you ever wished you could port your iOS Sprite Kit game to OS X? Surprisingly, it’s easier than you think.

Apple developed Sprite Kit with the goal of keeping iOS and Mac OS X development as identical as possible – which makes it extremely easy to develop for both environments simultaneously.

This tutorial will show you how to take an existing iOS Sprite Kit game — the finished project from Sprite Kit Tutorial for Beginners — and adapt it to Mac OS X.

You’ll learn how to maintain both versions in a single Xcode project and how to continue development of your game without copying and pasting heaps of code to keep both versions up to date.

It’s recommended that you have some familiarity with Mac OS X Development if you choose to work through this tutorial. However, one of the wonderful features of Sprite Kit is that you don’t need to have a lot of prior experience to turn out a great app.

If you do want to learn more about Mac development, you can check out this three-part tutorial on making a simple Mac app.

Getting Started

Download the starter project for this tutorial here, and build and run on your iPhone to try it out.

The starter project has a few changes from the original in order to showcase several differences between the iOS and OS X versions of the game.

Accelerometer Support

This project borrows accelerometer support from the Space Game Starter Kit which lets the player move their character up and down the screen by tilting the device, like so:

2014-04-08 13_56_00

More Destructive Power

The ninja star has now been imbued with magical ninja powers of epic destruction!

2014-04-08 14_20_49

Ok, maybe not that epic, but this version uses a nice particle effect to give the ninja star some visual impact.

Small Technical Improvements

Vicki Wenderlich lent her talents to the game and added a new tiled background which makes the game more aesthetically pleasing. As well, the status bar is hidden and the app now supports iPad resolutions.

Working with Projects and Targets

A project file ties all of your working files together, but what determines how your project is compiled is a target.

A single project can contain multiple targets. This lets you compile your project in several different ways, depending on which files are associated with a specific target and the specific build settings of the target. This should give you a clue as to how you’re going to set up this project to compile for OS X!

Let’s take a look at this project’s targets. Open the SpriteKitSimpleGame project, and at the top of the left panel, select your project to show the project settings. Then select SpriteKitSimpleGame in the project window.

Right now there are two targets, one for iOS and the other for iOS unit tests, as shown below:

Screen Shot 2014-04-08 at 10.26.37 pm

Alternatively, you can select SpriteKitSimpleGame from the Target list if the project and target list are both expanded:

Screen Shot 2014-04-13 at 11.32.19 am

When you build an application, you build it for the device or devices supported by the target. To see which devices the current target supports, click the SpriteKitSimpleGame drop down next to the Stop button at the top-left of the window, as shown below:

Screen Shot 2014-04-08 at 10.30.12 pm

This is where you’ll add a build target for Mac OS X.

Adding a Build Target for Mac OS X

Ensure your Xcode project is the active window and select File \ New \ Target as shown below:

Screen Shot 2014-04-08 at 10.33.15 pm

Xcode prompts you to select a template for your new target.

Select the OS X\Application\SpriteKit Game template and click Next, as shown below:

Screen Shot 2014-04-08 at 10.36.00 pm

Finally, type in the product name as SpriteKitSimpleGameMac and click Finish, like so:

Screen Shot 2014-04-08 at 10.37.25 pm

Note: If you plan to release your app on the Mac App Store, your bundle identifier must be registered separately on developer.apple.com. Profiles and provisioning are arranged separately for iOS and Mac OS X developer programs.

To try running your app on OS X, select SpriteKitSimpleGameMac from the scheme list, then select My Mac 64-bit as shown below:

Screen Shot 2014-04-08 at 10.43.07 pm

What do you think will happen when you try to build and run your project using this new target?

  • A — The game will run perfectly — and that’s the end of the tutorial!
  • B — The game will fail to compile with numerous errors.
  • C — The game will compile but crash on launch.
  • D — The game will compile and run but won’t be the game you expected.

Solution: B – the game will fail to compile with numerous errors. You'll see why shortly.

Structuring Your Files for Multiple Targets

Now that you have a Mac OS X build target, you’ll be modifying and adding files to make the app work under OS X. Keeping track of these files can be difficult as time goes on, so it’s best to set up a system right now to help keep all the files organized.

Minimize all of your groups and add a new group named SharedResources as shown below:
Screen Shot 2014-04-08 at 11.02.12 pm

This group will be the main location for your game and will contain resources that are common to both Mac OS X and iOS targets.

While you’re at it, create a separate group named Testing and move the test modules into it, like so:

Screen Shot 2014-04-08 at 11.08.13 pm

Keeping the unit tests in a separate folder helps avoid visual clutter.

Now that you have your new organized structure for the various files in your project, you’ll need to move the files into the appropriate locations.

Expand SharedResources and SpriteKitSimpleGame. Click and drag the Particles and Sounds groups from SpriteKitSimpleGame to SharedResources.

Next, drag over the sprites.atlas folder, MyScene.h, MyScene.m, GameOverScene.h and GameOverScene.m. Your file structure should look like the one shown below:

Screen Shot 2014-04-08 at 11.17.26 pm

Delete the Spaceship.png file — you won’t need that any longer. This is just a boilerplate file that is automatically added when you create a game with the Sprite Kit template.

All the shared files of your Sprite Kit game now reside in the SharedResources group. Everything left in the SpriteKitSimpleGame relates to launching and managing the game on iOS.

Expand the SpriteKitSimpleGameMac group. You’ll need to remove the example game files from this group before you progress any further.

Delete MyScene.h, MyScene.m and Spaceship.png from the SpriteKitSimpleGameMac group and select Move to Trash. Your file list should look like so:

Screen Shot 2014-04-08 at 11.22.52 pm

Note: You may have noticed that the Mac version of your game does not contain a ViewController class; instead, it only has an AppDelegate. The UIViewController class is part of UIKit which is not available on Mac OS X. Instead, the AppDelegate creates an instance of NSWindow which will present your Sprite Kit scene.

As a final check, your fully-expanded file list should look like the following:

Screen Shot 2014-04-08 at 11.27.35 pm

You’ve removed unnecessary files from the project and organized it neatly. Now it’s time to modify your targets to let the compiler know which files to include with each target.

Adding Target Membership

Expand the Frameworks group and select UIKit.framework, like so:

Screen Shot 2014-04-08 at 11.39.04 pm

Expand the Utilities panel on the right and select the File Inspector, like so:

Screen-Shot-2014-04-08-at-11.40.23-pm-A

About halfway down the File Inspector you’ll see the Target Membership section. This is where you select the targets that will use this file.

UIKit is only available on the iOS platform, so leave it unchecked on your Mac OS X targets, as shown below:

Screen Shot 2014-04-08 at 11.40.54 pm

The Cocoa Framework is only available on Mac OS X so ensure it’s checked for your Mac targets and unchecked for your iOS targets like so:

Screen Shot 2014-04-08 at 11.49.11 pm

The Sprite Kit Framework is available to iOS and Mac OS X so set the target membership as below:

Screen Shot 2014-04-08 at 11.50.20 pm

Each individual file in your project has its own target membership with the exception of texture atlases. The atlas itself is the only place you need to set a target membership and all contained textures will be automatically included.

However, classes work a little differently. You can’t set a target membership on a .h file — instead, you must set the target membership on the .m file.

Armed with your new-found understanding of target membership, work through each file in your SharedResources group and make sure both SpriteKitSimpleGame and SpriteKitSimpleGameMac are ticked on each file. In total, you should need eight ticks to get the job done.

Next, work through each file in your SpriteKitSimpleGame group and make sure that only SpriteKitSimpleGame is ticked for each — they should all be set correctly at this point, but it’s good to check.

Finally, work through each file in the SpriteKitSimpleGameMac group and make sure that only SpriteKitSimpleGameMac is ticked for each. Again, you shouldn't have to change any, but it never hurts to check.

Now that your project is properly set up for your iOS and Mac targets, you can get down to what you’re good at — writing code!

Getting the Game to Build and Run

As it stands right now, your project will still build and run without issue for iOS. The changes you just made have no real effect on the existing game. However, if you build and run the Mac target, you’ll see a bunch of errors. That’s because you haven’t yet accounted for the differences between iOS and OS X.

Build and run your project using the SpriteKitSimpleGameMac target; what do you see?

You’ll receive the error Module ‘CoreMotion’ not found. Mac OS X doesn’t have a CoreMotion class or its equivalent; you’ll have to work around this issue and use the keyboard to control player movement. However, your primary goal is to get the project to a buildable state before you worry about implementation details like that.

But how will you fix this? You can't just remove the line of code referring to CoreMotion, otherwise the iOS version will break. You also can't work around it with a regular if statement, since the compiler will still check each line of code and throw an error if it doesn't recognize something.

Open MyScene.m and replace:

@import CoreMotion;

with the following code:

#if TARGET_OS_IPHONE
@import CoreMotion;
#endif

Unlike a regular if statement, an #if is performed by the preprocessor. TARGET_OS_IPHONE returns TRUE if the current target is iOS.

Note: If you plan to use an #if statement to check whether the target OS is Mac OS X, then the preferred way to check this is !TARGET_OS_IPHONE.

TARGET_OS_MAC seems to work — but the problem is that it also returns TRUE for iOS.

This might seem odd, but Apple uses !TARGET_OS_IPHONE in their example projects that contain multiple targets, so if it is a glitch, it’s most likely one they don’t plan to fix.

Now you will need to find the remaining code related to CoreMotion and surround it with #if statements.

Find the following code in the instance variables for MyScene.m:

CMMotionManager *_motionManager;

…and replace it with:

#if TARGET_OS_IPHONE
  CMMotionManager *_motionManager;
#endif

Scroll down to the init method and find the following code:

_motionManager = [[CMMotionManager alloc] init];
_motionManager.accelerometerUpdateInterval = 0.05;
[_motionManager startAccelerometerUpdates];

Replace the above code with the following:

#if TARGET_OS_IPHONE
_motionManager = [[CMMotionManager alloc] init];
_motionManager.accelerometerUpdateInterval = 0.05;
[_motionManager startAccelerometerUpdates];
#endif

Now, find the following code, still in MyScene.m:

[self updatePlayerWithTimeSinceLastUpdate:timeSinceLast];

…and replace it with the following:

#if TARGET_OS_IPHONE
[self updatePlayerWithTimeSinceLastUpdate:timeSinceLast];
#endif

Last but not least, locate the method named updatePlayerWithTimeSinceLastUpdate: and wrap the entire method with the following code:

#if TARGET_OS_IPHONE
- (void)updatePlayerWithTimeSinceLastUpdate:(CFTimeInterval)timeSinceLast 
.
.
.
}
#endif

If you build this against an iOS target, all of the above #if statements will return TRUE, so the app compiles just as it did before. In contrast, if you build this against a Mac OS X target all of the above #if statements will return FALSE and none of the above blocks will be compiled.

Take a look at the touchesEnded:withEvent: method in MyScene.m. Since the current version of Mac OS X doesn't support touch screens, this method is meaningless there. The Mac OS X version of this game will use mouse clicks instead as a perfectly adequate substitute for screen touches.

To avoid adding a bunch of boilerplate code, you’ll now create a class that inherits from SKScene to help you handle both screen touches and mouse clicks!

Adding Event Handlers

Select your SharedResources group.

On the menu bar select File \ New \ File… as shown below:

Screen Shot 2014-04-14 at 12.19.00 am

Select Objective-C Class from either the iOS or OS X category and click Next.

Screen Shot 2014-04-09 at 3.16.00 pm

Name the file SKMScene and make it a subclass of SKScene.

Screen Shot 2014-04-09 at 3.19.05 pm

Place the file directly under your project folder, make sure both iOS and Mac targets are ticked in the target area and click Create.

Screen Shot 2014-04-09 at 3.20.23 pm

Open SKMScene.h and replace its contents with the following code:

@import SpriteKit;
 
@interface SKMScene : SKScene
 
//Screen Interactions
-(void)screenInteractionStartedAtLocation:(CGPoint)location;
-(void)screenInteractionEndedAtLocation:(CGPoint)location;
 
@end

You’ll override the above two screen interaction methods using subclasses of SKMScene.

Add the following code to SKMScene.m directly under the line @implementation SKMScene:

#if TARGET_OS_IPHONE
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
  UITouch *touch = [touches anyObject];
  CGPoint positionInScene = [touch locationInNode:self];
  [self screenInteractionStartedAtLocation:positionInScene];
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
  UITouch *touch = [touches anyObject];
  CGPoint positionInScene = [touch locationInNode:self];
  [self screenInteractionEndedAtLocation:positionInScene];
}
 
- (void)touchesCancelled:(NSSet *)touches
               withEvent:(UIEvent *)event
{
  UITouch *touch = [touches anyObject];
  CGPoint positionInScene = [touch locationInNode:self];
  [self screenInteractionEndedAtLocation:positionInScene];
}
#else
-(void)mouseDown:(NSEvent *)theEvent {
  CGPoint positionInScene = [theEvent locationInNode:self];
  [self screenInteractionStartedAtLocation:positionInScene];
}
 
- (void)mouseUp:(NSEvent *)theEvent
{
  CGPoint positionInScene = [theEvent locationInNode:self];
  [self screenInteractionEndedAtLocation:positionInScene];
}
 
- (void)mouseExited:(NSEvent *)theEvent
{
  CGPoint positionInScene = [theEvent locationInNode:self];
  [self screenInteractionEndedAtLocation:positionInScene];
}
#endif
 
-(void)screenInteractionStartedAtLocation:(CGPoint)location {
  /* Overridden by Subclass */
}
 
-(void)screenInteractionEndedAtLocation:(CGPoint)location {
  /* Overridden by Subclass */
}

That’s a fair bit of code, but if you read through from the top, it makes a lot of sense. Touching the screen or the end of a touch event calls the methods in the #if TARGET_OS_IPHONE block. You then create a CGPoint containing the pixel location of the touch and calla the relevant screenInteraction method.

Pressing a mouse button or releasing the mouse button calls the methods in the #else section. Similar to above, you create a CGPoint containing the pixel location of the touch and call the relevant screenInteraction method.

The advantage of using this subclass is that both touch and click events call a screenInteraction method. The screenInteraction methods have no code as you’ll override these in your subclass.

Open MyScene.h and add the following import just under the existing #import statement:

#import "SKMScene.h"

Next, update the superclass in the @interface line to SKMScene as shown below:

@interface MyScene : SKMScene

This ensures your game scene inherits from your SKMScene subclass. You can now substitute your subclasses for the touch events.

In MyScene.m find the following line:

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {

…and replace it with the following:

-(void)screenInteractionEndedAtLocation:(CGPoint)location {

Next, delete the following lines from the method as you won’t need them any longer:

UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];

Build and run your project using the Mac OS X target; it should compile and run without too many issues:

Screen Shot 2014-04-09 at 3.50.23 pm

Congratulations — you’re now running your Sprite Kit game on your Mac! You’ll notice that it has a few bugs:

  1. Some Macs have a noticeable pause the first time you click on the screen.
  2. The particle effect appears to be a little broken on some Macs.
  3. Not everything is sized correctly on the screen.
  4. The background music has gone noticeably silent.

You’ll tackle each of these bugs in turn — and you’ll learn about a few of the common bugs you’ll encounter when you convert apps between platforms.

Correcting Pre-loading Issues

This bug may not raise its head on all systems, but when it does it definitely points to a performance issue. “First-time-through” bugs like this usually stem from the initial load of resources into memory.

Texture atlases are the first resource type that springs to mind, but since this app doesn’t contain animations or large complex images, it’s safe to assume the problem is somewhere else.

The sound effects are the next most likely candidate as the sound files don’t get loaded until the user clicks on the screen.

To fix this, add the following instance variable to MyScene.m:

SKAction *_playPewPew;

Next, add the following line to initWithSize: inside the if statement:

_playPewPew = [SKAction playSoundFileNamed:@"pew-pew-lei.caf" waitForCompletion:NO];

This modifies your app to preload the sound file when the scene initializes.

Find the following line in screenInteractionEndedAtLocation::

[self runAction:[SKAction playSoundFileNamed:@"pew-pew-lei.caf" waitForCompletion:NO]];

…and replace it with the following:

[self runAction:_playPewPew];

Build and run your app; click the mouse and ensure that the delay has been eradicated.

If your system didn’t expose this bug, then at least the changes above will ensure that it won’t happen on someone else’s system.

Correcting SKS Issues

2014-04-14 21_44_29

At the time of this writing the bug in the particle effect seems to be an issue with Xcode 5. You’ll override the file reference to the texture in your sks file.

Technically, there isn’t anything wrong with your sks file – and you won’t experience this issue on all systems – but you should fix it nonetheless.

Find the following line in projectile:didCollideWithMonster: of MyScene.m:

SKEmitterNode *emitter = [NSKeyedUnarchiver unarchiveObjectWithFile:[[NSBundle mainBundle] pathForResource:@"SmallExplosion" ofType:@"sks"]];

Add the following code directly under the line you found above:

emitter.particleTexture = [SKTexture textureWithImageNamed:@"spark"];

All you've done here is tell the emitter explicitly which texture to use.

Build and run your app; now you can admire your epic glitch-free particle effect.

Correcting Image Resizing Issues

Navigate to your SpriteKitSimpleGameMac group and then to AppDelegate.m. Take a look at the screen size set in applicationDidFinishLaunching:.

It’s set to 1024 x 768 — this is the resolution of the non-Retina iPad.

Now take a look at the contents of sprites.atlas. As expected, all iPad versions of images are suffixed with ~ipad so that your app knows to use these images when it runs on an iPad.

Unfortunately, there is no ~mac suffix you can use here; instead, you’ll need to create a separate texture atlas for the Mac version of your app.

In order to keep your build as small as possible, you should use a texture atlas with only the resolutions your app will actually use.

Right-click on sprites.atlas and select Show in Finder to take you to the images folder.

Create a copy of sprites.atlas and delete all images from the copied folder that don’t have ~ipad in their name.

Screen Shot 2014-04-14 at 9.56.44 pm

Next, remove the ~ipad designator from the file names but leave the @2x designator intact.

Note: The @2x files have been left in the project to support the Retina screens on the Macbook Pro.

Rename the folder to spritesMac.atlas and drag the renamed folder into your project.

In the Choose options for adding these files dialog, make sure only the SpriteKitSimpleGameMac target is ticked in the Add to targets section as shown below:

Screen Shot 2014-04-09 at 11.17.41 pm

Click Finish. Now that the folder has been imported, select sprites.atlas and turn off target membership for the Mac target. This ensures that each texture atlas works independently of the other.

Keeping with the spirit of staying organized, move the iOS texture atlas into the iOS group and the Mac texture atlas into the Mac group, as shown below:

Screen Shot 2014-04-09 at 11.25.38 pm

Next, go to Project\Clean. This will remove any old files from your build directory (if you forget to do this it might not work, as sprites.atlas may still exist).

Build and run your app; you should see that everything loads at the proper size, as shown below:

Screen Shot 2014-04-09 at 11.52.43 pm

At this point your app supports iPhone, iPad and Mac OS X resolutions – and it's Retina-compatible to boot.

Correcting Soundtrack Issues

Finally, you’ll need to deal with the missing soundtrack to your game.

Look at ViewController.m in the SpriteKitSimpleGame group. viewWillLayoutSubviews has a small section of code which instantiates AVAudioPlayer and sets it to repeat forever.

NSError *error;
NSURL *backgroundMusicURL = [[NSBundle mainBundle] URLForResource:@"background-music-aac" withExtension:@"caf"];
self.backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
self.backgroundMusicPlayer.numberOfLoops = -1;
[self.backgroundMusicPlayer prepareToPlay];
[self.backgroundMusicPlayer play];

Aha — you don’t have a ViewController in Mac OS X. Therefore, you’ll need to call this code from AppDelegate instead.

Find the following line in AppDelegate.m of the SpriteKitSimpleGameMac group:

@implementation AppDelegate

…and replace it with the following:

@import AVFoundation;
 
@implementation AppDelegate {
    AVAudioPlayer *backgroundMusicPlayer;
}

Next, add the following code to the top of applicationDidFinishLaunching::

NSError *error;
NSURL * backgroundMusicURL = [[NSBundle mainBundle] URLForResource:@"background-music-aac" withExtension:@"caf"];
backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
backgroundMusicPlayer.numberOfLoops = -1;
[backgroundMusicPlayer prepareToPlay];
[backgroundMusicPlayer play];

Build and run your app; the music plays on!

You’ve resolved all of the bugs from the Mac conversion, but you still haven’t solved the issue of game controls in the Mac version of the game.

Using the Keyboard

The ninja’s movements in the iOS version of the app are controlled by tilting the device. These movements are processed by CoreMotion, and the game loop calls updatePlayerWithTimeSinceLastUpdate: to calculate the new player location for the current frame.

Responding to key presses requires a slightly different approach using the available methods to listen for keypress events.

In MyScene.m, find the #endif statement that wraps updatePlayerWithTimeSinceLastUpdate: and add the following code just before it:

#else
-(void)keyDown:(NSEvent *)theEvent {
 
 
}

This adds a keyDown: method so the Mac version of the scene can respond to keypresses as well. Note that there's a corresponding keyUp: method for handling the release of a key, useful for events that should only last for the duration of the keypress.

You don’t want to respond to just any keypress; you can find out which key was pressed using the passed-in NSEvent.

Add the following code between the curly braces of the keyDown: method you just added:

-(void)keyDown:(NSEvent *)theEvent {
  NSString *keyPressed = [theEvent charactersIgnoringModifiers];
  if ([keyPressed length] == 1) {
      NSLog(@"Key: %c",[keyPressed characterAtIndex:0]);
  }
}

Here you extract the pressed characters from the event without any modifier keys. This means key combinations like Command + S will be ignored. As well, you check that the keypress is only one character in length to filter out any other unwanted events. You’ll dump the key pressed out to the console.

Build and run your project; press a few keys while the game is running and you’ll see the keys pressed show up in the debug area, similar to the following example:

Screen Shot 2014-04-10 at 4.54.04 pm

Since you’ll use the up and down arrow keys to move your player sprite, press those keys and see what you get in the console:

Screen Shot 2014-04-10 at 5.01.55 pm

Hmm, that looks a little odd. The arrow keys are part of the group known as function keys, so they don’t have a proper character representation. But don’t fret: there’s an easy way to detect when function keys are pressed.

NSEvent is your best friend when it comes to managing keyboard and mouse inputs on the Mac. This tutorial merely introduces NSEvent; it’s highly recommended that you check out the full NSEvent class reference.

For now, take a quick look at the section of NSEvent documentation that deals with the function keys enum. The keys you’re concerned with are NSUpArrowFunctionKey and NSDownArrowFunctionKey.

Go back to MyScene.m and find the keyDown: method you just added.

Comment out the NSLog statement and paste the following code immediately below that:

unichar charPressed = [keyPressed characterAtIndex:0];
switch (charPressed) {
    case NSUpArrowFunctionKey:
        [_player runAction:[SKAction moveByX:0.0f y:50.0f duration:0.3]];
        break;
    case NSDownArrowFunctionKey:
        [_player runAction:[SKAction moveByX:0.0f y:-50.0f duration:0.3]];
        break;
    default:
        break;
}

Here you store the pressed character as a Unicode character and compare it to the up and down function keys. You then use an SKAction to move the character up or down accordingly.

Build and run your project; press the up and down arrow keys and you should see your character moving up and down the screen like so:

2014-04-10 17_33_56

You’ve spent all this time modifying the game to be played on a Mac, but you still need to check that you haven’t affected any portion of the iOS version of the game!

Build and run your project using the iOS target and play around with it a bit to make sure none of the game functions have been affected by your changes.

Where to Go From Here?

You can grab the completed sample files for this project from here.

Now that you have a good understanding of the work required to convert an iOS-targeted project into an iOS / Mac targeted project, you'll definitely want to create any new Sprite Kit games with cross-platform capabilities right from the start. I have made a cross-platform Sprite Kit project starter template on Github that might be useful – you can clone it and use it for your own games.

To learn more about making games that work on both iOS and OS X (especially related to scene sizes, image sizes, and aspect ratios, and UIKit vs. Cocoa Touch considerations), check out the upcoming second edition of our book iOS Games by Tutorials, where we go into more detail.

If you have any comments or questions, feel free to join the discussion below!


The post How to Port Your Sprite Kit Game from iOS to OS X appeared first on Ray Wenderlich.

Basic UIView Animation with Swift Tutorial

Use animation to see what's inside this picnic basket!

Update note: This tutorial was updated for iOS 8 and Swift by Bjørn Ruud. Original post by Malek Trabelsi.

One of the coolest things about iOS apps is how animated many of them are. You can have views fly across the screen, fade in or fade out, rotate around, and much more!

Not only does this look cool, but animations are good indicators that something is going on that a user should pay attention to, such as more information becoming available.

The best part about animation in iOS is that it is incredibly easy to implement programmatically! It’s literally just a couple lines of code and you’re up and running.

In this tutorial you’ll get a chance to go hands-on with UIView animation to create a simple app about going on a picnic. The picnic basket opens in a neat animated way, and then you get to look what’s inside – and take decisive action!

In the process you will learn how to use the basic UIView animation APIs, and how to chain animations.

So grab your picnic basket and let’s get started!

Note: At publication time, our understanding is we cannot post screenshots of beta products. Therefore, we are suppressing Xcode 6 and iOS 8 screenshots in this tutorial until we are sure it is OK.

Getting Started

Just so you can appreciate how nice and easy UIView animation is, bear in mind that you would need to perform a few steps in order to animate a view moving across the screen if iOS didn’t provide you with built-in animation support:

  • Schedule a method to be called in your app, for every frame.
  • Every frame, calculate the new position of the view (x and y) based on the desired final destination, the time to run the animation, and the time run so far.
  • Update the view’s position to the calculated position.

That’s not a ton of work, but it might make you think twice about implementing an animation system yourself. Plus, it gets a lot more complicated to keep track of the more animations you do.

But don’t worry – animations are extremely easy to use in UIKit! There are certain properties on views, such as the view’s frame (its size and position), alpha (transparency), and transform (scale/rotation/etc.) which have built-in animation support when they are modified within an animation block. Instead of having to do all of the manual animation steps above, you simply:

  • Set up an animation block, specifying how long it should run and a few other optional parameters.
  • Set an animatable property on a view within the block, such as its frame.

That’s it – UIKit will take over and handle the calculations and updates for you!
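
For instance, here's a minimal sketch of those two steps, assuming a view called box is already on screen (the name is just for illustration):

// 1. Set up an animation block with a duration of half a second.
UIView.animateWithDuration(0.5, animations: {
  // 2. Set animatable properties inside the block; UIKit animates the change.
  box.alpha = 0.0
  box.frame = CGRectOffset(box.frame, 0, 100)
})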

Let’s dive in and see what this looks like in code by creating an animation of a picnic basket opening when you start up the app.

An Opening Picnic Basket

Open up Xcode and select File \ New Project. Choose iOS \ Application \ Single View Application and click Next. Name the project Picnic, select Swift for language and iPhone for device. Click Next, choose a place to save your project and click Create.

Next, download a copy of some images and sounds made by Vicki that you’ll need for this project. In Xcode, select the Images.xcassets asset catalog. Unzip the downloaded archive and drag all files in the Images folder into the asset catalog view.

From the project navigator, click on Main.storyboard to open it in the editor. You will notice that the view shown is rectangular instead of a size that fits a device. In Xcode 6 storyboards will by default use size classes, which means the same storyboard can be used for all target devices regardless of screen size. That’s a pretty nice feature, but unnecessary for this project so let’s keep it simple.

With the storyboard selected, open the File inspector in the right hand pane. In the Interface Builder Document section there is a checkbox marked Use Size Classes. Uncheck it, make sure iPhone is selected as the new target in the dialog, then click Disable Size Classes.

You also need to disable autolayout by unchecking the checkbox marked Use Auto Layout. For simple layouts like this auto layout can make things more complex than necessary, and it is also beyond the scope of this tutorial. To learn about auto layout take a look at Beginning Auto Layout Tutorial in iOS 7.

Now you need to set up the user interface. You will start with the basket doors that will open to reveal the contents of the basket. First drag a View to the main view and resize it to completely fill its parent view. This view will serve as a container for the basket doors. Go to the Size inspector and uncheck all springs and struts so that the view “floats” in the center of the parent view.

Drag two UIImageViews to the container view, one on top filling up about half the space, and one on the bottom filling up the bottom half. Select the top image and open the attributes inspector. Set the image to door_top and set View Mode to Bottom. Set the bottom image to door_bottom, set View Mode to Top. Resize the image views until they look OK, as shown below.

Align the top and bottom parts of the basket.

By putting the image views in a container centered in the view the basket doors will always be centered on the screen regardless of screen size.

It’s time for some code. You need to declare properties for the two new image views in order to animate them later. Open the assistant editor and make sure ViewController.swift is shown. Ctrl-drag the top image view into the assistant editor, just below the class declaration, and name the outlet basketTop. Do the same for the bottom image view and name it basketBottom. The result should look like this:

class ViewController: UIViewController {
  @IBOutlet var basketTop: UIImageView
  @IBOutlet var basketBottom: UIImageView
  //...
}

Now that you have properties referencing the image views you created in Interface Builder you can animate them to open the basket when the view first appears. Switch back to ViewController.swift and add the following method below viewDidLoad():

override func viewDidAppear(animated: Bool) {
  UIView.animateWithDuration(0.7, delay: 1.0, options: .CurveEaseOut, animations: {
    var basketTopFrame = self.basketTop.frame
    basketTopFrame.origin.y -= basketTopFrame.size.height
 
    var basketBottomFrame = self.basketBottom.frame
    basketBottomFrame.origin.y += basketBottomFrame.size.height
 
    self.basketTop.frame = basketTopFrame
    self.basketBottom.frame = basketBottomFrame
  }, completion: { finished in
    println("Basket doors opened!")
  })
}

The animateWithDuration(duration, delay, options, animations, completion) call defines an animation that lasts 0.7 seconds and starts after a delay of one second. The animation is set to “ease out”, which means it slows down towards the end.

The animations block defines what the animation will do. Here, the frames of the two image views are set to their final destinations: the top basket image moves up and the bottom one moves down to “open” the basket. Since you set the duration, UIKit takes over from there and runs a neat animation of your basket opening up.

The completion block runs after the animation completes or is interrupted – the finished parameter is a Boolean that will let you know if the animation finished or not.

Build and run the code to try it out for yourself – pretty easy to get such a neat effect, eh?

A Second Layer of Fun

When you go out on a picnic you usually don’t just throw your food straight into the basket – instead you put a little napkin on top to shield the food from pesky infiltrators. So why not have a little more fun with animations and add a napkin layer to open as well?

Go back to Main.storyboard, and select the two image views you worked with so far. With the image views selected, go to the Edit menu and select Duplicate. Position these duplicate views so they overlap with the originals.

From the Attributes inspector, set the new top view’s image to fabric_top and the new bottom view’s image to fabric_bottom.

You want the fabric views to be underneath the picnic basket. Open the document outline if it isn’t open already by navigating to Editor \ Show Document Outline. Since views are listed from bottom to top you want the napkin to be below the basket. Drag the views around in the outline so the order is as follows:

  • fabric_top
  • fabric_bottom
  • door_top
  • door_bottom

Now that you have the new image views in your scene, can you figure out how to animate this yourself based on everything you’ve learned? The goal is to make the napkin move off screen also, but start moving slightly after the picnic basket starts moving. Go ahead – you can always check back here if you get stuck.

The Solution

In case you had any trouble, here’s the solution.

First add two new outlet properties to ViewController.swift by ctrl-dragging from the storyboard to the assistant editor as you did for the basket outlet properties:

@IBOutlet var fabricTop: UIImageView!
@IBOutlet var fabricBottom: UIImageView!

Add the following to the bottom of viewDidAppear():

UIView.animateWithDuration(1.0, delay: 1.2, options: .CurveEaseOut, animations: {
  var fabricTopFrame = self.fabricTop.frame
  fabricTopFrame.origin.y -= fabricTopFrame.size.height
 
  var fabricBottomFrame = self.fabricBottom.frame
  fabricBottomFrame.origin.y += fabricBottomFrame.size.height
 
  self.fabricTop.frame = fabricTopFrame
  self.fabricBottom.frame = fabricBottomFrame
}, completion: { finished in
  println("Napkins opened!")
})

There are no big changes compared to the basket door animation you already worked with. Note the slightly larger duration and delay values, which make the napkin animation start and finish after the basket animation.

Build and run your code, and you should see the basket open in an even cooler manner!

How To Chain Animations

So far you’ve just been animating a single property on these UIViews – the frame. Also, you’ve done just a single animation, and then you were done.

However as mentioned earlier in this article, there are several other properties you can animate as well, and you can also trigger more animations to run after one animation completes. So let’s try this out by experimenting with animating two more interesting properties (center and transform) and using some animation chaining!

But first – let’s add the inside of the picnic basket! Open up Main.storyboard and drag yet another image view to the view container. Resize and position it so it fills the view, and make sure it’s at the top of the list so it’s shown underneath everything else. Set the image to plate_cheese.

There’s one more thing you have to add. Somehow, despite all of your precautions, a sneaky bug has made its way into the basket! Add another UIImageView as a subview of the container view. Put it right below the Plate view, and set the image to bug. Set its frame to x: 160, y: 185, width: 129, height: 135 in the Size inspector.

At this point, your View Controller Scene should look like the following in the document outline:

  • plate_cheese
  • bug
  • fabric_top
  • fabric_bottom
  • door_top
  • door_bottom

Next add an outlet for the new pest image view in ViewController.swift by ctrl-dragging from the storyboard to the assistant editor, like you did for the other outlets. Call the outlet bug.

Eventually, you’ll get to squash the bug invading the picnic. Switch to ViewController.swift and add the following property inside the class declaration:

var isBugDead = false

This will be used later to determine whether the bug is dead or not.

Next, add the following methods:

func moveBugLeft() {
  if isBugDead { return }
 
  UIView.animateWithDuration(1.0,
    delay: 2.0,
    options: .CurveEaseInOut | .AllowUserInteraction,
    animations: {
      self.bug.center = CGPoint(x: 75, y: 200)
    },
    completion: { finished in
      println("Bug moved left!")
      self.faceBugRight()
    })
}
 
func faceBugRight() {
  if isBugDead { return }
 
  UIView.animateWithDuration(1.0,
    delay: 0.0,
    options: .CurveEaseInOut | .AllowUserInteraction,
    animations: {
      self.bug.transform = CGAffineTransformMakeRotation(CGFloat(M_PI))
    },
    completion: { finished in
      println("Bug faced right!")
      self.moveBugRight()
    })
}
 
func moveBugRight() {
  if isBugDead { return }
 
  UIView.animateWithDuration(1.0,
    delay: 2.0,
    options: .CurveEaseInOut | .AllowUserInteraction,
    animations: {
      self.bug.center = CGPoint(x: 230, y: 250)
    },
    completion: { finished in
      println("Bug moved right!")
      self.faceBugLeft()
    })
}
 
func faceBugLeft() {
  if isBugDead { return }
 
  UIView.animateWithDuration(1.0,
    delay: 0.0,
    options: .CurveEaseInOut | .AllowUserInteraction,
    animations: {
      self.bug.transform = CGAffineTransformMakeRotation(0.0)
    },
    completion: { finished in
      println("Bug faced left!")
      self.moveBugLeft()
    })
}

You can see that the way animation chaining is done is by starting a new animation in the completion block. The chain moves the bug left, rotates right, moves right, rotates left, and then repeats.

Take note of the .AllowUserInteraction option in the options parameter. This is because you need the bug view to respond to touch events (to get squashed :] ) while it’s moving, so that option allows you to interact with views while they are being animated.

The code moves the bug by modifying the center property rather than the frame property. This sets where the center of the bug image is, which can sometimes be easier to work with than setting the frame.

Bug rotation is done by setting an affine transform. CGAffineTransformMakeRotation is a helper function that creates a rotation transform, and its angle parameter is in radians, which is why the value passed is M_PI to rotate 180 degrees.
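If you prefer to think in degrees, you could add a small conversion helper. This isn’t part of the tutorial project – degreesToRadians is a hypothetical name, shown here only to make the rotation easier to read:

// Hypothetical helper (not in the tutorial project): converts degrees to radians.
func degreesToRadians(degrees: Double) -> CGFloat {
  return CGFloat(degrees * M_PI / 180.0)
}
 
// With it, the 180-degree rotation reads a little more naturally:
self.bug.transform = CGAffineTransformMakeRotation(degreesToRadians(180))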

Now you just need to start the animation chain. Add the following line to the bottom of viewDidAppear():

moveBugLeft()

Build and run, and you should see the bug scurrying about!

Bug!


Squash the Bug!

Now it’s the moment I know you’ve been waiting for – it’s time to squash that bug!

Add the following method to handle the bug-squashing event:

func handleTap(gesture: UITapGestureRecognizer) {
  let tapLocation = gesture.locationInView(bug.superview)
  if bug.layer.presentationLayer().frame.contains(tapLocation) {
    println("Bug tapped!")
    // add bug-squashing code here
  } else {
    println("Bug not tapped!")
  }
}

When responding to a tap, you need to check whether the tap was on the bug. Normally, you would check the tapLocation against the bug view’s frame, but here notice you’re using the presentation layer’s frame. This is an important distinction: during a UIView animation, the view’s own frame jumps straight to its final value, while the “presentation layer” represents what is actually shown on screen at each moment. Technically, the bug’s frame is already at its destination, and the in-between movement is just a presentation layer change.

To find out more about layers and what this all means you can check out Introduction to CALayers.

Next, add the following lines to the end of viewDidAppear():

let tap = UITapGestureRecognizer(target: self, action: Selector("handleTap:"))
view.addGestureRecognizer(tap)

This code initializes a new tap gesture recognizer and then adds it to the view.

Build and run, and tap on the screen. Depending on whether you hit the bug or not, you’ll see “Bug tapped!” or “Bug not tapped!” in the Xcode console.

The most satisfying part is next. Find handleTap() and add the following code inside the if block where the bug is tapped:

if isBugDead { return }
isBugDead = true
UIView.animateWithDuration(0.7, delay: 0.0, options: .CurveEaseOut, animations: {
  self.bug.transform = CGAffineTransformMakeScale(1.25, 0.75)
}, completion: { finished in
  UIView.animateWithDuration(2.0, delay: 2.0, options: nil, animations: {
    self.bug.alpha = 0.0
  }, completion: { finished in
    self.bug.removeFromSuperview()
  })
})

Once the bug is tapped, you first set isBugDead to true so the animation chain stops running. Then you can start a new chain of animations: first the bug is squished by applying a scale transform, and then it fades away by setting the alpha to 0 after a delay. After the fade the bug is removed from the superview.

Build and run, and now you should be able to squash the bug!

Squashed!


Gratuitous Sound Effect

This is totally unnecessary, but also totally fun – you’ll add a gratuitous bug squashing sound effect!

First drag the Sounds folder from the asset archive into the project. In the dialog that appears, make sure Copy items if needed is checked and Create groups is selected.

Back in ViewController.swift, add this import to the top of the file:

import AVFoundation

AVFoundation includes audio playback, so you need that import for maximum bug-squashing effect.

Next, add a property to the class declaration:

let squishPlayer: AVAudioPlayer

This will hold the audio player instance with your sound file.

Next, add the following initializer method:

init(coder aDecoder: NSCoder!) {
  let squishPath = NSBundle.mainBundle().pathForResource("squish", ofType: "caf")
  let squishURL = NSURL(fileURLWithPath: squishPath)
  squishPlayer = AVAudioPlayer(contentsOfURL: squishURL, error: nil)
  squishPlayer.prepareToPlay()
 
  super.init(coder: aDecoder)
}

This code ensures the AVAudioPlayer instance is set up when the view controller is initialized. If you know Objective-C, it might seem strange to call super.init() at the end of the initializer. In Swift, an initializer must always ensure all instance variables are initialized before the superclass initializer is called, and after that it’s safe to call instance and superclass methods. Note that instance variables with default values (like isBugDead) do not need to be set in a designated initializer.

Finally, add the following line to handleTap() just after you set isBugDead to true:

squishPlayer.play()

That will play the sound at just the right moment. And that’s it! Build and run your code, make sure the volume is turned up, and you can now squash the bug and get major audio satisfaction in the process.

Where To Go From Here?

Here is the final project with all of the code for this tutorial.

You can dream up additional animations based on what you’ve learned so far. For example, how about closing the picnic basket after squashing the bug (if, like me, you felt a little disgusted seeing the bug crawling around the food ;] )? You could also change when the basket doors and napkins open, so the basket stays closed until you touch it.
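As a rough sketch of that first idea, you could reverse the opening animation once the bug is gone – for instance, inside the innermost completion block in handleTap(), right after the bug is removed from its superview. The timing values below are arbitrary and this code isn’t part of the finished sample project:

UIView.animateWithDuration(0.7, delay: 1.0, options: .CurveEaseIn, animations: {
  // Hypothetical extra animation: close the basket doors again by
  // reversing the offsets used to open them.
  var basketTopFrame = self.basketTop.frame
  basketTopFrame.origin.y += basketTopFrame.size.height
 
  var basketBottomFrame = self.basketBottom.frame
  basketBottomFrame.origin.y -= basketBottomFrame.size.height
 
  self.basketTop.frame = basketTopFrame
  self.basketBottom.frame = basketBottomFrame
}, completion: nil)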

Throughout this tutorial you only used the animateWithDuration(duration, delay, options, animations, completion) function, but there are also other animation functions available:

  1. animateWithDuration(duration, animations)
  2. animateWithDuration(duration, animations, completion)
  3. animateKeyframesWithDuration(duration, delay, options, animations, completion)
  4. performSystemAnimation(animation, onViews, options, animations, completion)
  5. animateWithDuration(duration, delay, usingSpringWithDamping, initialSpringVelocity, options, animations, completion)

The first two are the same as the animation method you have used in this tutorial, except without delay and options parameters. The third is for keyframe-based animation which enables almost total control over how things are animated. The fourth is for running a system provided animation. The last one creates an animation using a spring-based motion curve. In other words, it bounces towards the end.
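For example, if you wanted the basket doors to bounce open, you could swap in the spring-based variant for the plain call used in viewDidAppear(). This is only a sketch – the damping and velocity values are arbitrary and not part of the tutorial project:

UIView.animateWithDuration(1.0, delay: 1.0,
  usingSpringWithDamping: 0.5,    // lower damping means more bounce
  initialSpringVelocity: 0.0,     // start the spring from rest
  options: .CurveEaseOut,
  animations: {
    var basketTopFrame = self.basketTop.frame
    basketTopFrame.origin.y -= basketTopFrame.size.height
    self.basketTop.frame = basketTopFrame
  },
  completion: nil)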

Now that you know the basics of using UIView animations you might want to take a look at the Animations section in the View Programming Guide for iOS for additional useful info.

How do you use UIView animation or Core Animation in your projects? I’d love to hear from you in the forum discussion below!

Basic UIView Animation with Swift Tutorial is a post from: Ray Wenderlich



IRC for iOS Developers

Chat with fellow iOS devs on IRC!


I’ve been working from home for over three years now, and while I absolutely love it, one of the things I miss most about working in an office is the camaraderie you have with fellow developers.

The good news is that in the past year or so, I’ve found my fix with an online alternative: IRC!

IRC is an internet chat protocol that has been around since the early days of the Internet. You can connect to IRC servers to chat about any subject imaginable – including iOS development, OS X development, and even Swift development.

I believe IRC is a great way to get to know fellow iOS developers, to get help with questions, and to help out others.

That’s why I’m writing this tutorial! This tutorial will help get you started with:

  • Choosing and installing an IRC client
  • Connecting to an IRC server and registering your nickname
  • Joining chat channels for iOS developers
  • Basic IRC etiquette

Let’s get chatting!

Note: Special thanks to Matthijs Hollemans and Nimesh Neema for their assistance with some parts of this tutorial!

Choosing an IRC Client and Getting Started

The first step is to choose, download and install an OS X IRC client, and then follow the instructions I’ve provided to connect to a chat room. Here are some of the most popular options, each covered below:

  • Colloquy
  • Adium
  • Irssi
  • Textual
  • IRCCloud (web-based)

Again – download and install the client of your choice, and then jump to the appropriate instructions below!

Getting Started: Colloquy

Connecting to an IRC server

Start up Colloquy and go to File\New Connection. For Nickname enter your preferred nickname, for Chat Server enter irc.freenode.net, and click Connect:

001_Colloquy

Back in your list of connections, after a few moments you should see a lightning bolt icon appear – this indicates you are connected. Note that you can always double click a connection to connect.

Registering your nickname

Click the Console button to reveal a connection to the IRC server itself. This will allow you to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

004_Colloquy

Check your email and enter the command that it tells you in the text field and hit enter to continue. You should see a success message from NickServ.

Back in your Connections list, right click your connection and choose Get Info. Enter the password you set in the password field:

005_Colloquy

Right click on the connection, and choose Disconnect. Then double click to connect again. If you still have your console open, you will see an “authentication successful” message – this means your nickname and password are registered!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Click the Join Room button in your Connections window:

006_Colloquy

Make sure the Connection is set to irc.freenode.net, for the Chat Room enter cocoa-init, and click Join:

007_Colloquy

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Adium

Connecting to an IRC server

Start up Adium. If the Setup Assistant appears, click the x button to dismiss it.

Then go to File\Add Account\IRC (Internet Relay Chat). For Nickname enter your preferred nickname, for Hostname enter irc.freenode.net, and click OK:

008_Adium

After a few moments, the green icon next to your name should light up to indicate that you are online. Note that you can always use the dropdown to switch your status to available to connect.

Registering your nickname

Go to File\New Chat, make sure that From is set to the freenode account you just added, set To to NickServ, and click Message. This will allow you to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

009_Adium

Enter the following command down in the text field at the bottom of the screen and hit enter:

REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

010_Adium

Check your email and enter the command that it tells you in the text field (without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Close the NickServ window. In the Contacts window, choose the dropdown next to Available and set it to Offline to disconnect. Then set it back to Available to reconnect.

After a few moments, NickServ will ask you for your password, so enter the password you set in the password field:

011_Adium

If you don’t see any errors, this means your nickname and password are registered!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Go to File\Join Group Chat…, make sure the Account is set to irc.freenode.net, for Channel enter #cocoa-init, and click Join:

012_Adium

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Irssi

Connecting to an IRC server

Irssi is different than the other options so far in that everything is on the command line!

Start up Irssi and you’ll see the following:

014_Irssi

Enter these commands to connect to Freenode:

/set nick yournickname
/network add -whois 1 -msgs 4 -kicks 1 -modes 4 freenode
/server add -auto -network freenode irc.freenode.net 6667
/connect freenode

After a few moments you should see some welcome messages from Freenode – this indicates you are connected.

015_Irssi

Registering your nickname

Next you need to send some commands to NickServ to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

This causes irssi to open a new window – use Ctrl-P to switch to it.

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email in the new window.

Check your email and enter the command that it tells you in the text field (but without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Hit Ctrl-P to go back to the main window. Enter these commands to auto-identify with NickServ whenever you connect from now on, save your settings and quit:

/network add -autosendcmd "/^msg nickserv identify password;wait 2000" freenode
/save
/quit

Restart irssi, and verify that you automatically connect and register your nickname.

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Simply enter the following command:

/join #cocoa-init

You will see a list of users in the channel, and you can use the text field at the bottom to chat.

013_Irssi

And you’re in! For more information, check out the Irssi documentation.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Textual

Connecting to an IRC server

Start up Textual, click the + button in the lower left, and select Add Server:

016_Textual

For Network Name enter Freenode and for Server Address enter irc.freenode.net:

017_Textual

Switch to the Identity tab, for Nickname enter your preferred nickname, and click Save:

018_Textual

Back in the main window, double click the Freenode entry to connect. You should see a message from the server – this indicates you are connected.

019_Textual

Registering your nickname

Next you need to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

020_Textual

Check your email and enter the command that it tells you in the text field (without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Back on the sidebar, right click your Freenode connection and choose Server Properties. In the Identity tab, enter the password you set in the Personal Password field:

021_Textual

Right click on the freenode connection, and choose Disconnect. Then right click and choose Connect to connect again. If you don’t get any errors, this means you’re connected and authenticated successfully!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Right click the Freenode entry and choose Join Channel. For Channel enter #cocoa-init, and click Save:

022_Textual

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the next section to find out about more channels you can join!

Getting Started: IRCCloud

Note: Some IRC channels ban web-based clients like IRCCloud. You may prefer to use one of the other clients to avoid this.

Connecting to an IRC server

Go to irccloud.com and register for a free account. Once you have signed up, you will be automatically directed to the Join a new network screen.

IRCCloud_Homepage

Under hostname enter irc.freenode.net. For Nickname, enter your preferred nickname. Leave the other values at their defaults and click the Join network button.

IRCCloud_JoinNetwork

Registering your nickname

You will need to register your nickname with the server before you can start chatting. Click on freenode towards the right side of the window to reveal the server console. Here you can send commands to register your nickname, which is required to connect to some of the iOS development channels.

In the text field shown at the bottom of the screen, enter the following command:

/msg NickServ REGISTER password youremail@example.com

IRCCloud_RegisterNickname

After a few moments, you should see a reply from NickServ, letting you know that it has sent you an email:

Check your email and enter the command that it tells you in the text field and hit enter to continue. You will see a successfully verified message from NickServ.

IRCCloud_VerifyRegistration

Now click on freenode towards the right side to select the server and click the Identify Nickname button. Once you are identified successfully, you are good to join channels.

IRCCloud_IdentifyNickname

Joining a channel

In the text field at the bottom of the screen, enter the following command:

/join #cocoa-init

You will soon be redirected to the #cocoa-init channel screen. You can use the text field at the bottom of the screen to start chatting.

IRCCloud_Chatting

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

IRC Channels for iOS Developers

Now that you’ve successfully connected to IRC, you may be wondering what some good channels are to join. Here are our recommendations:

  • #cocoa-init: This is the channel you connected to in the tutorial. It’s actually a brand new channel, oriented to new developers (and beginner questions) in particular. It’s great if you are either a new Cocoa developer, or if you enjoy helping or meeting newer developers. Kyle Robson, Erica Sadun, and Lyle Andrews are the lead organizers of this channel, and I hang out here from time to time, so stop by and say hi!
  • #swift-lang: Another relatively new channel, focused on the Swift language itself. This channel is particularly active lately and has some nice discussions. Mike Ash hangs out here.
  • #iphonedev: The original and busiest iOS development channel on Freenode. This is the place to go for giving and getting advice on intermediate to advanced topics. Discussions about the official SDK only, no jailbreaking.
  • #iphonedev-chat: This is the sister channel of #iphonedev, for off-topic discussions. Sometimes it’s fun to talk about things other than apps, and this is the place to go. It’s great for those water cooler conversations – get your gossip here!
  • #macdev: All the cool kids are doing iOS these days but if you’re old school and make OS X apps, then this is the channel to find likeminded developers. It’s not as busy as the iPhone channels but the regulars here are very knowledgeable.
  • #iphone: For chatting about everything related to the iPhone. This is also a good place to go for jailbreaking questions.

IRC Etiquette

There are a few areas of IRC Etiquette that you should keep in mind.

First, it’s cool to ask questions on IRC, but if you do be sure to try to answer questions and help others as well. Learn the art of asking good questions. If you want to share source code, don’t paste it directly into the channel but use a “pastebin” instead.

Second, note that IRC can be very distracting if you let it. What I personally have found helpful is to simply minimize IRC and ignore it for a while when I get busy or am in the middle of something. Don’t worry, no-one will be insulted if you leave mid-conversation – we all do the same thing :]

Sometimes people who have nothing better to do with their time (usually bored kids) find it funny to troll on IRC. They do this just to get a rise out of people. The best advice is to ignore them. If a troll finds no response, they’ll go away eventually. If the trolling gets really bad, notify one of the channel operators so they can kick the trolls out of the room. Of course, don’t be a troll yourself. ;]

Remember that text — especially in real time chat — lacks the finesse of face-to-face conversation. It’s good to have a thick skin on IRC. It’s easy to get offended — or to offend — and start a flame war, but that spoils the mood for everyone and will get you kicked, or even banned, from the channel. Respect the channel rules.

Tip: Most IRC clients support “tab completion”. So if you want to respond to someone with the nick JonnyAppleseed, just type the first few letters of the nick followed by the tab key, and the IRC app will complete the name for you. Typing “jo<tab>” is a lot quicker than typing the full name.

Be nice, and make friends!

Where To Go From Here?

Enjoy! Remember the whole idea is to have an informal place to chat, help each other out, hang out, and have fun – when you have time to spare and need a “water cooler” moment! :]

Myself and many other IRC fans out there hope to get a chance to talk to you soon!

IRC for iOS Developers is a post from: Ray Wenderlich


Readers’ App Reviews – July 2014

hipsterface

Fun in the sun with apps!

I thought you’d all have gone on vacation by now, but I should have known you’d keep making awesome apps instead!

The raywenderlich.com community just can’t stop building new, exciting apps.

This month we’ve got:

  • An app where you can make music by smiling
  • An app to help you track your time
  • An app to make you look like a hipster
  • And games galore!

What are you waiting for? You’ve got a lot of apps to download from your fellow readers!

Beelly

beelly

Did you think Flappy Bird was hard? You haven’t seen Beelly yet :]

Beelly inverts the controls on you every 8 seconds. One second, tilting right will move you right and the next tilting right will move you left. As if that wasn’t enough to bring the pain, the game objective itself is to thread a needle with this cute little bee.

If you dare, steer Beelly through a winding meadow with pinpoint accuracy and see how far you can get. >:]

Smilophone

smilophone
A truly one of a kind app, Smilophone is an instrument we can all play.

Smilophone creates music based on your facial expressions! Using the camera of your iOS device, Smilophone uses face tracking technology to figure out whether you’re smiling or raising your eyebrows.

You can smile to make high-pitched sounds and raise your eyebrows to change the tune. Sad faces make sad sounds. Your face is all you need to create a musical work of art.

Sometime

sometime
Most of us as developers are all too aware of the pain that is time tracking. Thanks to Sometime, that may no longer be a problem at all.

Sometime makes it easy to track time on multiple projects. Simply set up “buckets” for tracking time. Each bucket can be assigned to a project and client so you can get granular in your tracking. Tap a bucket to start tracking in real time.

Sometime also lets you see an overview timeline of your work for each day. Add in geofencing and calendar integration and you have an app that can track all the time you need right in your pocket. This is definitely a productivity tool worth checking out.

S3nsitive

sensitive
S3nsitive is a puzzle game with a little more to it than meets the eye. It looks simple at first: just get from A to B. But there’s much more to it than that.

Each step you take eliminates the block beneath you, so you can’t backtrack in this tricky puzzle. And don’t spend too much time thinking; blocks between platforms can only support you for so long.

With 40 levels, a sweet soundtrack, and GameCenter leaderboards, this puzzler is definitely worth checking out.

Flyover – goes around the world

flyover2
Flyover puts you in charge of your own airplane to fly around the world.

With over 260 destinations, you choose where your airplane flies. You can upgrade its speed, capacity, and maximum distance with cash you earn flying passengers between cities.

Keep track of fuel costs for maximum efficiency and watch out for bad weather grounding your planes.

Hipster Face Live

hipsterface
Well, camera apps come and go, but hipster faces live on forever.

Hipster Face Live definitely got a few laughs from my friends. You can select different stamps like hats, glasses, beards, etc., and then see them line up in real time on your face using facial recognition.

You can take snapshots, then save them to your camera roll or share them directly. It’s quite a bit of fun to take silly pictures and send them to your mom. Your mom needs more silly hipster pictures, so download this app and give it a go! :]

Last Ninja

lastninja
The Last Ninja standing must fight the undead. Zombies galore on this highrise of pain.

Zombies are everywhere. Each floor you advance, they get tougher and more cunning. But you’re a ninja! Cut them down with slashes and chops. Use special abilities to clear them with style.

Above all, keep jumping! With spikes on the floor growing ever closer, it’s death or zombies. You decide. ;]

TheNews

thenews
TheNews is an app that makes catching up on the daily news quick and minimal.

TheNews integrates with Designer News and Hacker News in a clean, simple interface that makes reading a delight. The gesture-based controls you’d expect make managing your list a breeze.

Embedded web browsing, commenting, and sharing keep you in the app without needing to bounce around. Truly a one-stop shop for quick news snippets.

Juggler: A Game About Juggling

juggler
I don’t know about you, but juggling is hard. Juggler, however, makes it a fun game.

Juggler is about not dropping any of the balls that fall onto the screen. It supports full multitouch, so you can juggle as many at a time as you can handle.

The higher your score the more balls thrown at you. You’ve got to be fast if you want to top the GameCenter charts.

AS Test

astest
AS Test measures your arithmetic skills with speed tests, offers training to help you improve, and provides global leaderboards to climb.

With built in data tracking, AS Test can show you where you struggle on a question heat map. The heat map pinpoints problem areas based on all the questions you’ve been presented. Then using the training section you can hone your skills and battle for highest global score.

Speed counts, but so does difficulty. The more questions you answer within the time limit, and the more complex they are, the higher you’ll rank.

What the Hell?!

whatthehell
What would you do if you were a poor little devil kicked out of Demon Academy? Go to heaven of course!

Help this cute little devil jump his way to heaven through obstacles and puzzles. Use four powerups: fire, ice, blasting, and shrinking.

The game offers 30 completely free levels of addictive side-scrolling platformer-style gameplay.



Honorable Mentions

Each month, tons of our readers submit awesome applications they’ve made for me to review. While I give every app a try, I don’t have time to write about them all. These are still great apps. It’s not a popularity contest or even a favorite-picking contest; I simply enjoy getting a glimpse of the community through your apps. Take a moment to check out these great apps I just didn’t have enough time to share with you.

RebelChick
BubbleTT
Pirate Ring
Ball Smasher: The Big Bang
VBall ScoreMaster
County
Novae Marathon
Filetto RT
Retro Sparkle
Family Life on the Map
Newton’s Playground
Black Screen Video Spy Recorder
Make The Match!
Squirmy Puzzle
ArithMate
Cannon Bird
Banometer
Alchemistas Beyond The Veil
Animal Sounds & Name



Where To Go From Here?

Another month come and gone. I love seeing what our community of readers comes up with. The apps you build are the reason we keep writing tutorials. Make sure you tell me about your next one, submit here!

If you’ve never made an app, this is your month! Check out our free tutorials to become an iOS star. What are you waiting for – I want to see your app next month!

Readers’ App Reviews – July 2014 is a post from: Ray Wenderlich


How to Create a Game Like Cut the Rope Using Sprite Kit


Feed the Crocodile

Ever feed a pineapple to a crocodile? In this tutorial, you’ll find out how!

Prior to iOS 7, if you wanted to make a game, you either had to know a bit of arcane OpenGL magic or rely on a third-party library to do the heavy lifting for you. Once you had your graphics engine in place, oftentimes you needed to add an additional layer of physics to it, bringing in yet another library to simulate real-world behavior. This could often result in additional dependencies as well as additional language requirements (such as C or C++).

With the introduction of iOS 7, Apple changed all of this. In one fell swoop, developers now had access to a 2D graphics and physics engine, all accessible through Objective-C. Now, developers could focus on making a game as opposed to managing the architecture of one.

This game framework is called Sprite Kit, and in this tutorial you are going to learn how to make a game similar to Cut the Rope, an award-winning physics-based puzzle game. You’ll learn how to:

  • Add objects to scenes;
  • Create animation sequences;
  • Add music;
  • And, most of all, how to work within Sprite Kit’s physics world!

By the end of this tutorial, you’ll be well on your way to using Sprite Kit in your own projects.

Just keep in mind that this is not an entry-level tutorial. If classes such as SKNode or SKAction are entirely new to you, then check out our Sprite Kit Tutorial for Beginners. That tutorial will quickly get you up to speed so you can start feeding pineapples to crocodiles.

Wait. What?

Read on.

Getting Started

In this tutorial, you’ll be creating a game called Cut the Verlet. This game is modeled after Cut the Rope, which tasks you with cutting a rope holding a candy so it drops into the mouth of a rather hungry, albeit impatient, creature. Each level adds additional challenges such as spiders and buzz saws. This game was demonstrated in a previous tutorial using Cocos2D, but this time you will be using Sprite Kit to build it out.

Modeling Particle Trajectories

So what is a verlet? Verlet is short for Verlet integration, which is a way to model the trajectories of particles in motion and also a great tool for modeling your rope physics. Gustavo Ambrozio, the author of the Cocos2D version of this tutorial, provides an excellent overview of verlets and how they are applied to this game. Give that section a read before continuing with this tutorial. Think of it as required reading. :]
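In a nutshell, position Verlet integration advances each particle from its current and previous positions instead of storing an explicit velocity. Here is a minimal sketch of the core update step, written in Swift purely for illustration – the rope class you’ll build below is Objective-C and doesn’t use this code:

import UIKit
 
// Illustration only: a particle tracked by its current and previous positions.
struct Particle {
    var position: CGPoint
    var oldPosition: CGPoint
}
 
// One Verlet step: the implied velocity is (position - oldPosition), and gravity
// is applied as a constant acceleration over dt (negative for a y-up world).
func verletStep(particle: Particle, gravity: CGFloat, dt: CGFloat) -> Particle {
    let velocityX = particle.position.x - particle.oldPosition.x
    let velocityY = particle.position.y - particle.oldPosition.y
    let newPosition = CGPoint(x: particle.position.x + velocityX,
                              y: particle.position.y + velocityY + gravity * dt * dt)
    return Particle(position: newPosition, oldPosition: particle.position)
}

A hand-rolled rope would repeat this step for every segment and then constrain neighboring segments to stay a fixed distance apart; in this tutorial, Sprite Kit’s physics joints handle that constraint work for you.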

To get started, first download the starter project for this tutorial. Extract the project to a convenient location on your hard drive and then open it in Xcode for a quick look at how it’s structured.

The project’s contents are organized into the following groups, as shown below:

ec_cut_the-rope_project_setup_condensed

  • Classes contains the primary files such as the main view controller, the scene and the rope object class. You will be adding to these classes throughout this tutorial.
  • Helpers contains the files responsible for maintaining the game’s data and its constants throughout the program.
  • Resources/Sounds contains the project’s sound files.
  • Resources/Other contains the files you need to add particles to the scene.
  • Other Resources contains all the resources from third-party sources. This tutorial will be using SKUtils, a module created by the iOS Games by Tutorials team.
  • Images.xcassets contains the image assets for this project.

In addition, I’ve added all of the necessary #import statements to the starter project. This includes #import statements in the CutTheVerlet-Prefix.pch file.

Note: A .pch file is what is known as a precompiled header file. Precompiled headers are a tool used to speed up compilation. They also remove the need to import those files everywhere they’re used throughout your project. Just bear in mind that precompiled headers can obscure dependencies between classes.

Close the Resources and Other Resources folders; you won’t be making any changes in those areas. You’ll work directly only with the files located in the Classes and Helpers folders.

It’s time to begin!

Creating Constants

A constant is a variable you can rely upon: once you set its value, that value never changes. Constants are a great way to make your code more manageable. Global constants in particular can make your code easier to read and maintain.

constant=bff

In this project, you’ll create global constants to define sprite texture image names, sound file names, sprite node names, the z-order or zPosition of your sprites and the category defined for each sprite, which you’ll use for collision detection.

Open TLCSharedConstants.h and add the following code above the @interface line:

typedef NS_ENUM(int, Layer)
{
    LayerBackground,
    LayerForeground,
    LayerCrocodile,
    LayerRope,
    LayerPrize
};
 
typedef NS_OPTIONS(int, EntityCategory)
{
    EntityCategoryCrocodile = 1 << 0,
    EntityCategoryRopeAttachment = 1 << 1,
    EntityCategoryRope = 1 << 2,
    EntityCategoryPrize = 1 << 3,
    EntityCategoryGround = 1 << 4
};
 
extern NSString *const kImageNameForRopeHolder;
extern NSString *const kImageNameForRopeTexture;
 
extern NSString *const kImageNameForCrocodileBaseImage;
extern NSString *const kImageNameForCrocodileMouthOpen;
extern NSString *const kImageNameForCrocodileMouthClosed;
 
extern NSString *const kSoundFileNameForCutAction;
extern NSString *const kSoundFileNameForSplashAction;
extern NSString *const kSoundFileNameForBiteAction;
 
extern NSString *const kSoundFileNameForBackgroundMusic;
 
extern NSString *const kImageNameForPrize;
extern NSString *const kNodeNameForPrize;

The code above declares two int-based types: Layer, an NS_ENUM, and EntityCategory, an NS_OPTIONS bit mask. You’ll use these to determine the zPosition and collision category of a sprite when you add it to the scene – more about this soon.

The code also declares a group of constant NSString variables using the const keyword. The extern keyword comes in handy when creating global variables, as it allows you to create unambiguous declarations. That means you can declare a variable here but set its value elsewhere.

Solution Inside: Why do programmers name constants with a ‘k’ prefix?

Remember that part about declaring a variable here and setting it elsewhere? Well, the elsewhere in this case is TLCSharedConstants.m.

Open TLCSharedConstants.m and add the following code above the @implementation line:

NSString *const kImageNameForRopeHolder = @"ropeHolder";
NSString *const kImageNameForRopeTexture = @"ropeTexture";
 
NSString *const kImageNameForCrocodileBaseImage = @"croc";
NSString *const kImageNameForCrocodileMouthOpen = @"croc01";
NSString *const kImageNameForCrocodileMouthClosed = @"croc00";
 
NSString *const kSoundFileNameForCutAction = @"cut.caf";
NSString *const kSoundFileNameForSplashAction = @"splash.caf";
NSString *const kSoundFileNameForBiteAction = @"bite.caf";
 
NSString *const kSoundFileNameForBackgroundMusic = @"CheeZeeJungle.caf";
 
NSString *const kImageNameForPrize = @"pineapple";
NSString *const kNodeNameForPrize = @"pineapple";

Here you use string values to set the names of images and sound clips. If you’ve played Cut the Rope, you can probably figure out what the variable names represent. You also set a string value for the sprite node name that you’ll use in the collision detection methods, which the Collision Detection section of this tutorial will explain.

Now that you’ve got your constants in place, you can begin adding nodes to your scene, starting with the scenery itself—the background and foreground!

Note: While it may seem a bit premature to add some of these things, it’ll be easier to follow the tutorial if you define them right from the start. You’ll discover more about each one as you use them.

Adding the Background and Foreground

The starter project provides stub versions of the project’s methods—adding the code is your job. The first steps are to initialize the scene and add a background.

Open TLCMyScene.m and add the following properties to the interface declaration:

@property (nonatomic, strong) SKNode *worldNode;
@property (nonatomic, strong) SKSpriteNode *background;
@property (nonatomic, strong) SKSpriteNode *ground;
 
@property (nonatomic, strong) SKSpriteNode *crocodile;
 
@property (nonatomic, strong) SKSpriteNode *treeLeft;
@property (nonatomic, strong) SKSpriteNode *treeRight;

Here you define properties to hold references to the different nodes in the scene.

Now add the following block of code to initWithSize:, just after the comment that reads /* add setup here */:

self.worldNode = [SKNode node];
[self addChild:self.worldNode];
 
[self setupBackground];
[self setupTrees];

The code above creates an SKNode object and assigns it to the worldNode property. It then adds the node to the scene using addChild:.

It also calls two methods, one for setting up the background and one for setting up the two trees. Because the two methods are almost identical, I’ll explain them together once you’ve added them.

First, locate setupBackground and add the following:

self.background = [SKSpriteNode spriteNodeWithImageNamed:@"background"];
self.background.anchorPoint = CGPointMake(0.5, 1);
self.background.position = CGPointMake(self.size.width/2, self.size.height);
self.background.zPosition = LayerBackground;
 
[self.worldNode addChild:self.background];
 
self.ground = [SKSpriteNode spriteNodeWithImageNamed:@"ground"];
self.ground.anchorPoint = CGPointMake(0.5, 1);
self.ground.position = CGPointMake(self.size.width/2, self.background.frame.origin.y);
self.ground.zPosition = LayerBackground;
 
[self.worldNode addChild:self.ground];
 
SKSpriteNode *water = [SKSpriteNode spriteNodeWithImageNamed:@"water"];
water.anchorPoint = CGPointMake(0.5, 1);
water.position = CGPointMake(self.size.width/2, self.ground.frame.origin.y + 10);
water.zPosition = LayerBackground;
 
[self.worldNode addChild:water];

Next, locate setupTrees and add this code:

self.treeLeft = [SKSpriteNode spriteNodeWithImageNamed:@"treeLeft"];
self.treeLeft.anchorPoint = CGPointMake(0.5, 1);
self.treeLeft.position = CGPointMake(self.size.width * .20, self.size.height);
self.treeLeft.zPosition = LayerForeground;
 
[self.worldNode addChild:self.treeLeft];
 
self.treeRight = [SKSpriteNode spriteNodeWithImageNamed:@"treeRight"];
self.treeRight.anchorPoint = CGPointMake(0.5, 1);
self.treeRight.position = CGPointMake(self.size.width * .86, self.size.height);
self.treeRight.zPosition = LayerForeground;
 
[self.worldNode addChild:self.treeRight];

Now that everything is in place, it’s time to explain.

In setupBackground and setupTrees, you create an SKSpriteNode and initialize it using spriteNodeWithImageNamed:, whereby you pass in the image name and assign it to its equivalently-named variable. In other words, you initialize the property variables.

You then change each of the anchorPoints from the default value of (0.5, 0.5) to a new value of (0.5, 1).

Note: For more information about the unit coordinate system, review Working with Sprites in Apple’s Sprite Kit Programming Guide.

You also set the sprite’s position (location) and zPosition (depth). For the most part, you’re only taking the size of the scene’s width and height into consideration when you set the sprite’s position.

The ground sprite, however, needs to be positioned directly under the edge of the background. You accomplish this by getting a handle on the background’s frame using self.background.frame.origin.y.

Likewise, you want the water sprite, which isn’t using a declared variable, directly under the ground sprite with a little space in between. You achieve this using self.ground.frame.origin.y + 10.

Recall that in TLCSharedConstants.h, you specified some constants for use with the sprite’s zPosition. You use two of them in the code above: LayerBackground and LayerForeground. Since SKSpriteNode inherits from SKNode, you have access to all of SKNode’s properties, including zPosition.

Finally, you add all of the newly created sprites to your world node.

While it’s possible to add sprites directly to a scene, creating a world node to contain things will be better in the long run, especially because you’re going to apply physics to the world.

You’ve got official approval for your first build and run! So… what are you waiting for?

Build and run your project. If you did everything right, you should see the following screen:

tc_cut_the-rope_build_01

It’s a lonely world out there. It’s time to bring out the crocodiles!

Adding and Animating the Crocodile Node

Adding the crocodile node is not much different from adding the background and foreground.

Locate setupCrocodile inside of TLCMyScene.m and add the following block of code:

self.crocodile = [SKSpriteNode spriteNodeWithImageNamed:kImageNameForCrocodileMouthOpen];
self.crocodile.anchorPoint = CGPointMake(0.5, 1);
self.crocodile.position = CGPointMake(self.size.width * .75, self.background.frame.origin.y + (self.crocodile.size.height - 5));
self.crocodile.zPosition = LayerCrocodile;
 
[self.worldNode addChild:self.crocodile];
 
[self animateCrocodile];

The code above uses two of the constants you set up earlier: kImageNameForCrocodileMouthOpen and LayerCrocodile. It also sets the position of the crocodile node based on the background node’s frame.origin.y location and the crocodile node’s size.

Just as before, you set the zPosition to place the crocodile node on top of the background and foreground. By default, Sprite Kit will “layer” nodes based on the order in which they’re added. You can choose a node’s depth yourself by giving it a different zPosition.

Now it’s time to animate the crocodile.

Find animateCrocodile and add the following code:

NSMutableArray *textures = [NSMutableArray arrayWithCapacity:1];
 
for (int i = 0; i <= 1; i++) {
    NSString *textureName = [NSString stringWithFormat:@"%@0%d", kImageNameForCrocodileBaseImage, i];
    SKTexture *texture = [SKTexture textureWithImageNamed:textureName];
    [textures addObject:texture];
}
 
CGFloat duration = RandomFloatRange(2, 4);
 
SKAction *move = [SKAction animateWithTextures:textures timePerFrame:0.25];
SKAction *wait = [SKAction waitForDuration:duration];
SKAction *rest = [SKAction setTexture:[textures objectAtIndex:0]];
 
SKAction *animateCrocodile = [SKAction sequence:@[wait, move, wait, rest]];
[self.crocodile runAction: [SKAction repeatActionForever:animateCrocodile]];

The previous code creates an array of SKTexture objects which you then animate using SKAction objects.

You also use a constant to set the base image name and set the number of images in your animation using a for loop. There are only two images for this animation, croc00 and croc01. Finally, you use a series of SKAction objects to animate the crocodile node.

An SKAction sequence: allows you to set multiple actions and run them as… you guessed it… a sequence!

Once you’ve established the sequence, you run the action on the node using the node’s runAction method. In the code above, you use repeatActionForever: to instruct the node to animate indefinitely.

The final step in adding and animating your crocodile is to make the call to setupCrocodile. You’ll do this in initWithSize:.

Toward the top of TLCMyScene.m, locate initWithSize: and add the following line after [self setupTrees];:

[self setupCrocodile];

That’s it! Prepare yourself to see a mean-looking crocodile wildly open and shut its jaws in the hope of eating whatever may be around.

Build and run the project to see this fierce reptile in action!

tc_cut_the-rope_build_02

That’s pretty scary, right? As the player, it’s your job to keep this guy happy with pineapple, which everyone knows is a crocodile’s favorite food. ;]

If your screen doesn’t look like the one above, you may have missed a step somewhere along the way.

You’ve got scenery and you’ve got a player, so let’s institute some ground rules to get this party started—physics!

Adding Physics to Your World

Sprite Kit makes use of iOS’s built-in physics engine, which in reality is just Box2D under the covers. If you’ve ever used Cocos2D, then you may have used Box2D to manage your physics. The big difference when using it through Sprite Kit is that Apple has encapsulated the library in an Objective-C wrapper, so you won’t need to use C++ to access it.

To get started, locate initWithSize: inside of TLCMyScene.m and add the following three lines after the [self addChild:self.worldNode] line:

self.worldNode.scene.physicsWorld.contactDelegate = self;
self.worldNode.scene.physicsWorld.gravity = CGVectorMake(0.0,-9.8);
self.worldNode.scene.physicsWorld.speed = 1.0;

Also, add the following to the end of the @interface line at the top of the file:

<SKPhysicsContactDelegate>

The code above sets the physics world’s contact delegate, gravity and speed via the world node. Remember, this is the node that will contain all your other nodes, so it makes sense to set up your physics here.

The gravity and speed values above are the defaults for their respective properties. The former specifies the gravitational acceleration applied to physics bodies in the world, while the latter specifies the speed at which the simulation executes. Since they’re the default values, you don’t need to specify them above, but it’s good to know they exist in case you want to tweak your physics.

Both of these properties can be found in the SKPhysicsWorld Class Reference.

Get Ready for the Ropes!

Now for the part you’ve been eagerly anticipating… the ropes! Excuse me—I mean, the verlets.

This project uses the TLCGameData class as a means of setting up the ropes. In a production environment, you’d likely use a PLIST or some other data store to configure the levels.

In a moment, you’re going to create an array of TLCGameData objects to represent your datastore.

Open TLCGameData.h and add the following properties:

@property (nonatomic, assign) int name;
 
@property (nonatomic, assign) CGPoint ropeLocation;
@property (nonatomic, assign) int ropeLength;
 
@property (nonatomic, assign) BOOL isPrimaryRope;

These will serve as your data model. Again, in a production environment, you’d benefit from using a PLIST rather than programmatically creating your game data.

Now go back to TLCMyScene.m and add the following after the last #import statement:

#define lengthOfRope1 24
#define lengthOfRope2 18
#define lengthOfRope3 15

Then, add two properties to hold the prize and level data. Add them right after the other properties:

@property (nonatomic, strong) SKSpriteNode *prize;
 
@property (nonatomic, strong) NSMutableArray *ropes;

Once you’ve done that, locate setupGameData and add the following block of code:

self.ropes = [NSMutableArray array];
 
TLCGameData *rope1 = [[TLCGameData alloc] init];
rope1.name = 0;
rope1.ropeLocation = CGPointMake(self.size.width *.12, self.size.height * .94);
rope1.ropeLength = lengthOfRope1;
rope1.isPrimaryRope = YES;
[self.ropes addObject:rope1];
 
TLCGameData *rope2 = [[TLCGameData alloc] init];
rope2.name = 1;
rope2.ropeLocation = CGPointMake(self.size.width *.85, self.size.height * .90);
rope2.ropeLength = lengthOfRope2;
rope2.isPrimaryRope = NO;
[self.ropes addObject:rope2];
 
TLCGameData *rope3 = [[TLCGameData alloc] init];
rope3.name = 2;
rope3.ropeLocation = CGPointMake(self.size.width *.86, self.size.height * .76);
rope3.ropeLength = lengthOfRope3;
rope3.isPrimaryRope = NO;
[self.ropes addObject:rope3];

The code above sets basic parameters for your ropes. The most important is the property isPrimaryRope, because it determines how the ropes are connected to the prize. When creating your ropes, only one should have this property set to YES.

Finally, add two more calls inside initWithSize: – [self setupGameData] and [self setupRopes]. (The listing below also shows a call to [self setupSounds], a stub provided by the starter project that you’ll fill in later in the tutorial.) When you’re done, initWithSize: should look like this:

if (self = [super initWithSize:size]) {
    /* Setup your scene here */
 
    self.worldNode = [SKNode node];
    [self addChild:self.worldNode];
 
    self.worldNode.scene.physicsWorld.contactDelegate = self;
    self.worldNode.scene.physicsWorld.gravity = CGVectorMake(0.0,-9.8);
    self.worldNode.scene.physicsWorld.speed = 1.0;
 
    [self setupSounds];
    [self setupGameData];
 
    [self setupBackground];
    [self setupTrees];
    [self setupCrocodile];
 
    [self setupRopes];
}
 
return self;

Now you can build the ropes!

verlets_please

The Rope Class

In this section, you’ll begin creating the class that handles the ropes.

Open TLCRope.h. You’re going to add two blocks of code to this file. Add the first block, the delegate’s protocol, before the @interface section:

@protocol TLCRopeDelegate
- (void)addJoint:(SKPhysicsJointPin *)joint;
@end

Add the second block of code, which includes the declaration of a custom init method, after the @interface section:

@property (strong, nonatomic) id<TLCRopeDelegate> delegate;
 
- (instancetype)initWithLength:(int)length usingAttachmentPoint:(CGPoint)point toNode:(SKNode*)node withName:(NSString *)name withDelegate:(id<TLCRopeDelegate>)delegate;
- (void)addRopePhysics;
 
- (NSUInteger)getRopeLength;
- (NSMutableArray *)getRopeNodes;

Delegation requires one object to define a protocol containing methods to which it expects its delegate to respond. The delegate class must then declare that it follows this protocol and implement the required methods.

Note: While you don’t need a delegate for the rope object to correctly function, this tutorial includes a delegate to show how to implement one. The final project includes a commented line showing an alternate method that foregoes delegates.
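To see how the pieces fit together, here's roughly what the delegate side will look like once you wire it up in TLCMyScene later in the tutorial: the scene declares that it conforms to TLCRopeDelegate and implements addJoint: by registering the joint with the physics world. This is just a preview – you'll add the real lines in the sections that follow.

// Preview of the delegate side (you'll add the real versions of these lines to TLCMyScene later).
 
// 1. Declare conformance in the class extension:
@interface TLCMyScene() <SKPhysicsContactDelegate, TLCRopeDelegate>
@end
 
// 2. Implement the protocol method in the @implementation:
- (void)addJoint:(SKPhysicsJointPin *)joint
{
    // TLCRope hands each pin joint it creates to its delegate;
    // the scene then registers the joint with the physics world it owns.
    [self.worldNode.scene.physicsWorld addJoint:joint];
}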

Once you’ve finished your header file, open TLCRope.m and add the following properties to the @interface section:

@property (nonatomic, strong) NSString *name;
 
@property (nonatomic, strong) NSMutableArray *ropeNodes;
 
@property (nonatomic, strong) SKNode *attachmentNode;
@property (nonatomic, assign) CGPoint attachmentPoint;
 
@property (nonatomic, assign) int length;

Your next step is to add the code for the custom init. Locate #pragma mark Init Method and add the following block of code:

- (instancetype)initWithLength:(int)length usingAttachmentPoint:(CGPoint)point toNode:(SKNode*)node withName:(NSString *)name withDelegate:(id<TLCRopeDelegate>)delegate;
{
    self = [super init];
    if (self)
    {
        self.delegate = delegate;
 
        self.name = name;
 
        self.attachmentNode = node;
        self.attachmentPoint = point;
 
        self.ropeNodes = [NSMutableArray arrayWithCapacity:length];
 
        self.length = length;
    }
    return self;
}

This is simple enough: You take the values that you passed into init and use them to set the private properties in your class.

The next two methods you need to add are getRopeLength and getRopeNodes.

Locate #pragma mark Helper Methods and add the following:

- (NSUInteger)getRopeLength
{
    return self.ropeNodes.count;
}
 
- (NSMutableArray *)getRopeNodes
{
    return self.ropeNodes;
}

The two methods above serve as a means of reading the values of the private properties.

You may have noticed that the code above refers to rope nodes, plural. That's because in this game, each of your ropes will be made up of many nodes to give them a fluid look and feel. Let's see how that will work in practice.

JoinorDie

Creating the Parts of the Rope

Although the next few methods you write will be incomplete, it’s for a good reason: to fully understand what’s happening, it’s important to take things step by step.

The first step is to add the rope parts, minus the physics.

Still working in TLCRope.m, locate #pragma mark Setup Physics and add the following method:

- (void)addRopePhysics
{
    // keep track of the current rope part position
    CGPoint currentPosition = self.attachmentPoint;
 
    // add each of the rope parts
    for (int i = 0; i < self.length; i++) {
        SKSpriteNode *ropePart = [SKSpriteNode spriteNodeWithImageNamed:kImageNameForRopeTexture];
        ropePart.name = self.name;
        ropePart.position = currentPosition;
        ropePart.anchorPoint = CGPointMake(0.5, 0.5);
 
        [self addChild:ropePart];
        [self.ropeNodes addObject:ropePart];
 
        /* TODO - Add Physics Here */
 
        // set the next rope part position
        currentPosition = CGPointMake(currentPosition.x, currentPosition.y - ropePart.size.height);
    }
}

In the code above, which you’ll call after the object has been initialized, you create each rope part and add it to the ropeNodes array. You also give each part a name so you can reference it later. Finally, you add it as a child of the actual TLCRope object using addChild:.

Soon, you’ll replace the TODO comment above with some code to give these rope parts their own physics bodies.

show_me_the_ropes

Now that you have everything in place, you’re almost ready to build and run to see your ropes. The final step is to add the ropes and the attached prize to the main game scene, which is exactly what you’re about to do.

Adding the Ropes and Prize to the Scene

Since the project uses a delegate pattern for TLCRope, you need to declare this in whatever class will act as its delegate. In this case, it’s TLCMyScene.

Open TLCMyScene.m and locate the @interface line. Change it to read as follows:

@interface TLCMyScene() <SKPhysicsContactDelegate, TLCRopeDelegate>

You’ll work with three methods to add the ropes to the scene, and they are all interconnected: setupRopes, addRopeAtPosition:withLength:withName: and setupPrizeUsingPrimaryRope:.

Starting with the first, locate setupRopes and add the following block of code:

// get ropes data
    for (int i = 0; i < [self.ropes count]; i++) {
        TLCGameData *currentRecord = [self.ropes objectAtIndex:i];
 
        // 1
        TLCRope *rope = [self addRopeAtPosition:currentRecord.ropeLocation withLength:currentRecord.ropeLength withName:[NSString stringWithFormat:@"%i", i]];
 
        // 2
        [self.worldNode addChild:rope];
        [rope addRopePhysics];
 
        // 3
        if (currentRecord.isPrimaryRope) {
            [self setupPrizeUsingPrimaryRope:rope];
        }
    }
 
    self.prize.position = CGPointMake(self.size.width * .50, self.size.height * .80);

Here’s the breakdown:

  1. First, you create a new rope, passing in its location, length and name. You’ll write this method in just a moment.
  2. The next section adds the rope to the world node and then sets up its physics.
  3. This final section sets up the prize, but only if the rope is the primary one.

Locate addRopeAtPosition:withLength:withName: and replace its current contents with this block of code:

SKSpriteNode *ropeHolder = [SKSpriteNode spriteNodeWithImageNamed:kImageNameForRopeHolder];
 
ropeHolder.position = location;
ropeHolder.zPosition = LayerRope;
 
[self.worldNode addChild:ropeHolder];
 
CGPoint ropeAttachPos = CGPointMake(ropeHolder.position.x, ropeHolder.position.y -8);
 
TLCRope *rope = [[TLCRope alloc] initWithLength:length usingAttachmentPoint:ropeAttachPos toNode:ropeHolder withName:name withDelegate:self];
rope.zPosition = LayerRope;
rope.name = name;
 
return rope;

Essentially, you’re using this method to create the individual ropes to display in your scene.

You’ve seen everything that’s happening here before. First you initialize an SKSpriteNode using one of the constants for the image name, then you set its position to the passed-in location and its zPosition using a layer constant. This SKSpriteNode will act as the “holder” for your rope.

The code then continues to initialize your rope object and set its zPosition and name.

Finally, the last piece of the puzzle gets you the prize – which is exactly what the next method handles. Clever, isn’t it? =]

Locate setupPrizeUsingPrimaryRope and add the following:

self.prize = [SKSpriteNode spriteNodeWithImageNamed:kImageNameForPrize];
self.prize.name = kNodeNameForPrize;
self.prize.zPosition = LayerPrize;
 
self.prize.anchorPoint = CGPointMake(0.5, 1);
 
SKNode *positionOfLastNode = [[rope getRopeNodes] lastObject];
self.prize.position = CGPointMake(positionOfLastNode.position.x, positionOfLastNode.position.y + self.prize.size.height * .30);
 
[self.worldNode addChild:self.prize];

You may have noticed in setupRopes and in the game data’s rope object a property for isPrimaryRope. This property lets you loop through the data and select a rope to use as the primary rope for connecting to the prize.

When you set isPrimaryRope to YES, the code above executes and finds the end of the passed-in rope object by getting the last object in the rope’s ropeNodes array. It does this using the helper method getRopeNodes from the TLCRope class.

Note: If you’re wondering why the prize’s position moves to the last node within the TLCRope object, this is due to a bug in Sprite Kit. When you set a physics body, if the position of the node is not set beforehand, the body will behave unpredictably. The code above uses the last rope part of a single rope (of your choosing) as the prize’s initial position.

And now for the moment you’ve been waiting for … Build and run your project!

tc_cut_the-rope_build_03

Wait… Why isn’t the pineapple attached to the ropes? Why does everything look so stiff?

Don’t worry! The solution to these problems is… more physics!

Adding Physics Bodies to the Ropes

Remember that TODO comment you added earlier? It’s time to replace that with some physics to get things moving!

Open TLCRope.m and locate addRopePhysics. Replace the TODO comment with the following code:

CGFloat offsetX = ropePart.frame.size.width * ropePart.anchorPoint.x;
CGFloat offsetY = ropePart.frame.size.height * ropePart.anchorPoint.y;
 
CGMutablePathRef path = CGPathCreateMutable();
 
CGPathMoveToPoint(path, NULL, 0 - offsetX, 7 - offsetY);
CGPathAddLineToPoint(path, NULL, 7 - offsetX, 7 - offsetY);
CGPathAddLineToPoint(path, NULL, 7 - offsetX, 0 - offsetY);
CGPathAddLineToPoint(path, NULL, 0 - offsetX, 0 - offsetY);
 
CGPathCloseSubpath(path);
 
ropePart.physicsBody = [SKPhysicsBody bodyWithPolygonFromPath:path];
ropePart.physicsBody.allowsRotation = YES;
ropePart.physicsBody.affectedByGravity = YES;
 
ropePart.physicsBody.categoryBitMask = EntityCategoryRope;
ropePart.physicsBody.collisionBitMask = EntityCategoryRopeAttachment;
ropePart.physicsBody.contactTestBitMask =  EntityCategoryPrize;
 
[ropePart skt_attachDebugFrameFromPath:path color:[SKColor redColor]];
CGPathRelease(path);

The code above creates a physics body for each of your rope parts, allowing you to set a series of physical characteristics for each node, like shape, size, mass, gravity and friction effects.

Physics bodies are created using the class method SKPhysicsBody bodyWithPolygonFromPath:. This method takes one parameter: a path. A handy online tool for generating a path is SKPhysicsBody Path Generator.

Note: Refer to the SKPhysicsBody Class Reference for more information about using physics bodies in your projects.

In addition to setting the physics bodies for each node, the code above also sets some key properties to handle collisions: categoryBitMask, collisionBitMask and contactTestBitMask. Each is assigned one of the constants you defined earlier. The tutorial will cover these properties in depth later.

If you were to run your app right now, each rope component would fall to the bottom of your screen. That’s because you’ve added a physics body to each but have yet to connect them together.

To fuse your rope, you’re going to use SKPhysicsJoints. Add the following method below addRopePhysics:

- (void)addRopeJoints
{
    // setup joint for the initial attachment point
    SKNode *nodeA = self.attachmentNode;
    SKSpriteNode *nodeB = [self.ropeNodes objectAtIndex:0];
 
    SKPhysicsJointPin *joint = [SKPhysicsJointPin jointWithBodyA: nodeA.physicsBody
                                                           bodyB: nodeB.physicsBody
                                                          anchor: self.attachmentPoint];
 
    // force the attachment point to be stiff
    joint.shouldEnableLimits = YES;
    joint.upperAngleLimit = 0;
    joint.lowerAngleLimit = 0;
 
    [self.delegate addJoint:joint];
 
    // setup joints for the rest of the rope parts
    for (int i = 1; i < self.length; i++) {
        SKSpriteNode *nodeA = [self.ropeNodes objectAtIndex:i-1];
        SKSpriteNode *nodeB = [self.ropeNodes objectAtIndex:i];
        SKPhysicsJointPin *joint = [SKPhysicsJointPin jointWithBodyA: nodeA.physicsBody
                                                               bodyB: nodeB.physicsBody
                                                              anchor: CGPointMake(CGRectGetMidX(nodeA.frame),
                                                                                  CGRectGetMinY(nodeA.frame))];
        // allow joint to rotate freely
        joint.shouldEnableLimits = NO;
        joint.upperAngleLimit = 0;
        joint.lowerAngleLimit = 0;
 
        [self.delegate addJoint:joint];
    }
}

This method connects all of the rope parts by using the SKPhysicsJointPin class. This joint allows the two connected bodies to rotate independently around the anchor point, resulting in a “rope-like” feel.

You connect (anchor) the first rope part to the attachmentNode at the attachmentPoint and link each subsequent node to the one before.

Note: In the code above, there is a call to the delegate to add the joints to the scene. As I mentioned before, this call isn’t necessary. You could simply use [self.scene.physicsWorld addJoint:joint]; to accomplish the same thing.

Now add a call to this new method at the very bottom of addRopePhysics:

[self addRopeJoints];

Build and run. Ack!

Falling Ropes

While you have some nice fluid ropes, they don’t contribute much if they just fall off the screen. :] That’s because you haven’t set up the physics bodies on the nodes in TLCMyScene.m. It’s time to add physics bodies to the prize and the rope holders!

Anchoring the Ends of the Ropes

Open TLCMyScene.m and locate addRopeAtPosition:withLength:withName:. Right below the line [self.worldNode addChild:ropeHolder];, add the following block of code:

CGFloat offsetX = ropeHolder.frame.size.width * ropeHolder.anchorPoint.x;
CGFloat offsetY = ropeHolder.frame.size.height * ropeHolder.anchorPoint.y;
 
CGMutablePathRef path = CGPathCreateMutable();
 
CGPathMoveToPoint(path, NULL, 0 - offsetX, 6 - offsetY);
CGPathAddLineToPoint(path, NULL, 6 - offsetX, 6 - offsetY);
CGPathAddLineToPoint(path, NULL, 6 - offsetX, 0 - offsetY);
CGPathAddLineToPoint(path, NULL, 0 - offsetX, 0 - offsetY);
 
CGPathCloseSubpath(path);
 
ropeHolder.physicsBody = [SKPhysicsBody bodyWithPolygonFromPath:path];
ropeHolder.physicsBody.affectedByGravity = NO;
ropeHolder.physicsBody.dynamic = NO;
 
ropeHolder.physicsBody.categoryBitMask = EntityCategoryRopeAttachment;
ropeHolder.physicsBody.collisionBitMask = 0;
ropeHolder.physicsBody.contactTestBitMask =  EntityCategoryPrize;
 
[ropeHolder skt_attachDebugFrameFromPath:path color:[SKColor redColor]];
CGPathRelease(path);

Here you add an SKPhysicsBody for each of the rope holders and set their collision properties. You want the holders to act as solid anchors, which you achieve by disabling their affectedByGravity and dynamic properties.

Next, locate the delegate method, addJoint:, and add this line:

[self.worldNode.scene.physicsWorld addJoint:joint];

This line adds the joints you just created in TLCRope.m to the scene’s physics world. It’s the line that holds the rope parts together!

The next step is to add a physics body to the prize and set its collision detection properties.

Locate setupPrizeUsingPrimaryRope:. Before the [self.worldNode addChild:self.prize]; line, add the following block of code:

CGFloat offsetX = self.prize.frame.size.width * self.prize.anchorPoint.x;
CGFloat offsetY = self.prize.frame.size.height * self.prize.anchorPoint.y;
 
CGMutablePathRef path = CGPathCreateMutable();
 
CGPathMoveToPoint(path, NULL, 18 - offsetX, 75 - offsetY);
CGPathAddLineToPoint(path, NULL, 5 - offsetX, 65 - offsetY);
CGPathAddLineToPoint(path, NULL, 3 - offsetX, 55 - offsetY);
CGPathAddLineToPoint(path, NULL, 4 - offsetX, 34 - offsetY);
CGPathAddLineToPoint(path, NULL, 8 - offsetX, 7 - offsetY);
CGPathAddLineToPoint(path, NULL, 21 - offsetX, 2 - offsetY);
CGPathAddLineToPoint(path, NULL, 33 - offsetX, 4 - offsetY);
CGPathAddLineToPoint(path, NULL, 38 - offsetX, 20 - offsetY);
CGPathAddLineToPoint(path, NULL, 34 - offsetX, 53 - offsetY);
CGPathAddLineToPoint(path, NULL, 36 - offsetX, 62 - offsetY);
 
CGPathCloseSubpath(path);
 
self.prize.physicsBody = [SKPhysicsBody bodyWithPolygonFromPath:path];
self.prize.physicsBody.allowsRotation = YES;
self.prize.physicsBody.affectedByGravity = YES;
self.prize.physicsBody.density = 1;
self.prize.physicsBody.dynamic = NO;
 
self.prize.physicsBody.categoryBitMask = EntityCategoryPrize;
self.prize.physicsBody.collisionBitMask = 0;
self.prize.physicsBody.contactTestBitMask = EntityCategoryRope;
 
[self.prize skt_attachDebugFrameFromPath:path color:[SKColor redColor]];
CGPathRelease(path);

Just like before, you add a physics body and set its collision detection properties.

To connect the prize to the end of the ropes, you need to do two things.

First, locate setupRopes. At the end of the for loop, add the following so that it’s the last line in the loop:

// connect the other end of the rope to the prize
[self attachNode:self.prize toRope:rope];

Then, locate attachNode:toRope: and add the following block of code:

SKNode *previous = [[rope getRopeNodes] lastObject];
node.position = CGPointMake(previous.position.x, previous.position.y + node.size.height * .40);
 
SKSpriteNode *nodeAA = [[rope getRopeNodes] lastObject];
 
SKPhysicsJointPin *jointB = [SKPhysicsJointPin jointWithBodyA: previous.physicsBody
                                                        bodyB: node.physicsBody
                                                       anchor: CGPointMake(CGRectGetMidX(nodeAA.frame), CGRectGetMinY(nodeAA.frame))];
 
 
[self.worldNode.scene.physicsWorld addJoint:jointB];

The code above gets the last node from the TLCRope object and creates a new SKPhysicsJointPin to attach the prize.

Build and run the project. If all your joints and nodes are set up properly, you should see a screen similar to the one below.

tc_cut_the-rope_build_04

It looks good, right? Hmm… Maybe it’s a little stiff? Then again, maybe that’s the effect you want in your game. If not, you can give it a more fluid appearance.

Go to the top of TLCMyScene.m and add the following line below your other #define statements:

#define prizeIsDynamicsOnStart YES

Then, locate setupRopes and replace its last line – the one that sets self.prize.position – with the following:

// reset prize position and set if dynamic; depends on your game play
self.prize.position = CGPointMake(self.size.width * .50, self.size.height * .80);
self.prize.physicsBody.dynamic = prizeIsDynamicsOnStart;

Build and run the project again.

ec_cut_the-rope_build_04b

Notice how much more fluid the ropes feel? Of course, if you prefer it the other way, change the value for prizeIsDynamicsOnStart to NO. It’s your game, after all! :]

A Few More Physics Bodies

Since you’ve already got physics bodies on your mind, it makes sense to set them up for the player and water nodes. Once you have those configured, you’ll be primed to start work on collision detection.

In TLCMyScene.m, locate setupCrocodile and add the following block of code just before the [self.worldNode addChild:self.crocodile]; line:

CGFloat offsetX = self.crocodile.frame.size.width * self.crocodile.anchorPoint.x;
CGFloat offsetY = self.crocodile.frame.size.height * self.crocodile.anchorPoint.y;
 
CGMutablePathRef path = CGPathCreateMutable();
CGPathMoveToPoint(path, NULL, 47 - offsetX, 77 - offsetY);
CGPathAddLineToPoint(path, NULL, 5 - offsetX, 51 - offsetY);
CGPathAddLineToPoint(path, NULL, 7 - offsetX, 2 - offsetY);
CGPathAddLineToPoint(path, NULL, 78 - offsetX, 2 - offsetY);
CGPathAddLineToPoint(path, NULL, 102 - offsetX, 21 - offsetY);
 
CGPathCloseSubpath(path);
 
self.crocodile.physicsBody = [SKPhysicsBody bodyWithPolygonFromPath:path];
 
self.crocodile.physicsBody.categoryBitMask = EntityCategoryCrocodile;
self.crocodile.physicsBody.collisionBitMask = 0;
self.crocodile.physicsBody.contactTestBitMask =  EntityCategoryPrize;
 
self.crocodile.physicsBody.dynamic = NO;
 
[self.crocodile skt_attachDebugFrameFromPath:path color:[SKColor redColor]];
CGPathRelease(path);

Just as with the rope nodes, you establish a path for your player node’s physics body and set its collision detection properties, each of which I’ll explain momentarily.

Last but not least, the water also needs a physics body so you can detect when the prize lands there rather than in the mouth of the hungry crocodile.

Locate setupBackground. Before the [self.worldNode addChild:water]; line, add the following block of code:

// make the size a little shorter so the prize will look like it’s landed in the water
CGSize bodySize = CGSizeMake(water.frame.size.width, water.frame.size.height -100);
 
water.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:bodySize];
water.physicsBody.dynamic = NO;
 
water.physicsBody.categoryBitMask = EntityCategoryGround;
water.physicsBody.collisionBitMask = EntityCategoryPrize;
water.physicsBody.contactTestBitMask =  EntityCategoryPrize;

Once again, you add a physics body, but this time you use bodyWithRectangleOfSize:. You also set the body’s collision detection properties.

Notice that you are assigning EntityCategoryGround as the categoryBitMask for the water object. In reality, EntityCategoryGround represents the point of failure for your fruit rather than the physical ground. If you wanted to include additional traps, such as spinning buzz saws, you would assign them the EntityCategoryGround bit mask as well.

Note: You may have noticed a call to skt_attachDebugFrameFromPath: for most of the physics bodies. This is a method from SKNode+SKTDebugDraw, which is part of a group of Sprite Kit utilities developed by Razeware. This particular method helps with debugging physics bodies. To turn it on, open SKNode+SKTDebugDraw.m and change the line BOOL SKTDebugDrawEnabled = NO; to BOOL SKTDebugDrawEnabled = YES;. This will draw a shape that represents your physics body. Don’t forget to turn it off when you’re done!

tc_cut_the-rope_debug

Making the Cut

It can’t be named Cut the Verlet if your verlets have no fear of being cut, right?

In this section, you’re going to learn how to work with the touch methods that will allow your players to cut those ropes. The first step is to define some basic variables.

Still working in TLCMyScene.m, add the following properties to the @interface section:

@property (nonatomic, assign) CGPoint touchStartPoint;
@property (nonatomic, assign) CGPoint touchEndPoint;
@property (nonatomic, assign) BOOL touchMoving;

You’ll need these for tracking the user’s touches.

Then, add your final definition at the top of TLCMyScene.m:

#define canCutMultipleRopesAtOnce NO

This will be useful if you want to make changes to the way the game functions.

iOS includes a few methods that deal with handling touch events. You’ll be working with three: touchesBegan:withEvent:, touchesEnded:withEvent: and touchesMoved:withEvent:.
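These are standard UIResponder overrides, so the starter project’s TLCMyScene.m should already contain empty versions of them. For reference, their signatures look like this:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { /* record where the swipe starts */ }
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { /* flag movement, spawn particles */ }
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { /* cut any ropes along the swipe */ }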

Locate touchesBegan:withEvent: and add the following code:

self.touchMoving = NO;
 
for (UITouch *touch in touches) {
    self.touchStartPoint = [touch locationInNode:self];
}

The code above resets touchMoving and records the location of the user’s touch in touchStartPoint.

Next, locate touchesEnded:withEvent: and add this:

for (UITouch *touch in touches) {
    if (touches.count == 1 && self.touchMoving) {
        self.touchEndPoint = [touch locationInNode:self];
 
        if (canCutMultipleRopesAtOnce) {
            /* allow multiple ropes to be cut */
 
            [self.worldNode.scene.physicsWorld enumerateBodiesAlongRayStart:self.touchStartPoint end:self.touchEndPoint usingBlock:^(SKPhysicsBody *body, CGPoint point, CGVector normal, BOOL *stop)
             {
                 [self checkRopeCutWithBody:body];
             }];
        }
        else {
            /* allow only one rope to be cut */
 
            SKPhysicsBody *body = [self.worldNode.scene.physicsWorld bodyAlongRayStart:self.touchStartPoint end:self.touchEndPoint];
            [self checkRopeCutWithBody:body];
        }
    }
}
 
self.touchMoving = NO;

This code does a few things. First, it makes sure the user is touching the screen with only one finger, and then it determines if the user is moving that finger. Finally, it retrieves and sets the property touchEndPoint. With that information, you can take the appropriate action based on whether you’re allowing only one rope or multiple ropes to be cut with a single swipe.

To cut multiple ropes, you use SKPhysicsWorld’s enumerateBodiesAlongRayStart:end:usingBlock: method to find every physics body that lies along the swipe. To cut a single rope, you use bodyAlongRayStart:end:, which returns only the first body found along that ray. Either way, you pass each body to the custom method checkRopeCutWithBody:.

Finally, locate touchesMoved:withEvent: and add this code:

if (touches.count == 1) {
    for (UITouch *touch in touches) {
        NSString *particlePath = [[NSBundle mainBundle] pathForResource:@"TLCParticle" ofType:@"sks"];
        SKEmitterNode *emitter = [NSKeyedUnarchiver unarchiveObjectWithFile:particlePath];
        emitter.position = [touch locationInNode:self];
        emitter.zPosition = LayerRope;
        emitter.name = @"emitter";
 
        [self.worldNode addChild:emitter];
 
        self.touchMoving = YES;
    }
}

Technically, you don’t need most of the code above, but it does provide for really cool effects when your users swipe the screen. You do, however, need to set the touchMoving property to YES, as the code above does. As you saw earlier, you’re evaluating this variable to determine if the user is moving her finger.

So, what does the rest of the code do?

It uses an SKEmitterNode to automatically generate awesome green particles that appear onscreen whenever the user swipes.

The code above loads a particle file and adds it to the worldNode. Particle emitters are not within the scope of this tutorial, but now that you know they exist… you’ve got something else to do later. =]

With the touch events complete, it’s time to finish the method that you call in touchesEnded:withEvent:.

Locate checkRopeCutWithBody: and add the following block of code:

SKNode *node = body.node;
if (body) {
    self.prize.physicsBody.affectedByGravity = YES;
    self.prize.physicsBody.dynamic = YES;
 
    [self.worldNode enumerateChildNodesWithName:node.name usingBlock:^(SKNode *node, BOOL *stop)
     {
         for (SKPhysicsJoint *joint in body.joints) {
             [self.worldNode.scene.physicsWorld removeJoint:joint];
         }
 
         SKSpriteNode *ropePart = (SKSpriteNode *)node;
 
         SKAction *fadeAway = [SKAction fadeOutWithDuration:0.25];
         SKAction *removeNode = [SKAction removeFromParent];
 
         SKAction *sequence = [SKAction sequence:@[fadeAway, removeNode]];
         [ropePart runAction: sequence];
     }];
}

The code above makes the prize dynamic, then enumerates through the child nodes in worldNode whose names match the swiped node. For each match, it removes every joint attached to the body from the physics world. Rather than removing the rope part abruptly, it uses an SKAction sequence to fade it out first.

Build and run the project. You should be able to swipe and cut all three ropes—as well as the prize (for now). Toggle the canCutMultipleRopesAtOnce setting to see how the behavior differs. By the way, aren’t those particles awesome?

tc_cut_the-rope_build_05

Collision Detection

You’re almost done! You’ve got swiping in place, physics out of the way, and you’ve specified all of the collision properties—but what exactly do they mean? How do they work? And, more importantly, how do they work with one another?

Here are a few key things to note:

  1. You need to specify that the TLCMyScene acts as a contact delegate: SKPhysicsContactDelegate.
  2. You need to set the delegate on the world node: self.worldNode.scene.physicsWorld.contactDelegate = self.
  3. You need to specify a categoryBitMask, a collisionBitMask and a contactTestBitMask.
  4. You need to implement the delegate methods.

You did the first two when you set up the physics for worldNode. You took care of the third when you set up the rest of the node’s physics bodies. That was excellent foresight on your part! =]

That leaves number four on the list: implement the methods. Before doing so, however, you need to create a few properties.

In the @implementation section of TLCMyScene.m, add the following:

@property (nonatomic, assign) BOOL scoredPoint;
@property (nonatomic, assign) BOOL hitGround;

Now you’re ready to modify the delegate method.

Locate didBeginContact: and add the following block of code:

SKPhysicsBody *other = (contact.bodyA.categoryBitMask == EntityCategoryPrize ? contact.bodyB : contact.bodyA);
if (other.categoryBitMask == EntityCategoryCrocodile) {
    if (!self.hitGround) {
        NSLog(@"scoredPoint");
        self.scoredPoint = YES;
    }
 
    return;
}
else if (other.categoryBitMask == EntityCategoryGround) {
    if (!self.scoredPoint) {
        NSLog(@"hitGround");
        self.hitGround = YES;
        return;
    }
}

The code above executes anytime the scene’s physicsWorld detects a collision. It checks the body’s categoryBitMask and, based on its value, either scores a point or registers a ground hit.

The three settings, categoryBitMask, collisionBitMask and contactTestBitMask, all work in tandem with one another.

  • The categoryBitMask sets the category to which the sprite belongs.
  • The collisionBitMask sets the category with which a sprite may collide.
  • The contactTestBitMask defines which categories trigger notifications to the delegate.

Check the SKPhysicsBody Class Reference for more detailed information.
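To see how the three masks interact, here’s a small hypothetical example that isn’t part of the tutorial project: a buzz-saw trap that belongs to the ground category, physically collides with the prize, and asks for contact callbacks when it touches the prize or a rope. Categories are combined with the bitwise OR operator:

// Hypothetical trap node, shown only to illustrate how the masks combine.
SKSpriteNode *trap = [SKSpriteNode spriteNodeWithColor:[SKColor grayColor] size:CGSizeMake(40, 40)];
trap.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:trap.size];
 
trap.physicsBody.categoryBitMask    = EntityCategoryGround;                      // what this body is
trap.physicsBody.collisionBitMask   = EntityCategoryPrize;                       // what it physically collides with
trap.physicsBody.contactTestBitMask = EntityCategoryPrize | EntityCategoryRope;  // what triggers didBeginContact: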

This game uses five categories, as defined within the TLCSharedConstants class. Open TLCSharedConstants.m and take another look. You will see the collision categories you set up earlier in the tutorial.

EntityCategoryCrocodile = 1 << 0,
EntityCategoryRopeAttachment = 1 << 1,
EntityCategoryRope = 1 << 2,
EntityCategoryPrize = 1 << 3,
EntityCategoryGround = 1 << 4

You want to detect when the prize collides with the crocodile node and when the prize collides with the water. You’re not going to award points or end the game based on any contact with the rope nodes, but setting categories for the rope and rope attachment points will help make the rope look more realistic at its attachment point.

Note: There are additional delegate methods that you don’t need for this tutorial. To learn about them, review the SKPhysicsContact Class Reference and the SKPhysicsContactDelegate Protocol Reference.

Build and run the project. When you cut the ropes, you should see log statements corresponding with where the prize lands.

ec_cut_the-rope_logs

Bonus Animation!

While you do have the faint outline of a game, users are not going to be staring at their console to see the win or fail condition. Also, if you cut the correct rope, the fruit falls through the crocodile as opposed to the crocodile eating it. Users will expect to see the crocodile munch down that pineapple.

It’s time to fulfill that expectation with animation. To do this, you’ll modify nomnomnomActionWithDelay:.

In TLCMyScene.m, find nomnomnomActionWithDelay: and add the following block of code:

[self.crocodile removeAllActions];
 
SKAction *openMouth = [SKAction setTexture:[SKTexture textureWithImageNamed:kImageNameForCrocodileMouthOpen]];
SKAction *wait = [SKAction waitForDuration:duration];
SKAction *closeMouth = [SKAction setTexture:[SKTexture textureWithImageNamed:kImageNameForCrocodileMouthClosed]];
 
SKAction *nomnomnomAnimation = [SKAction sequence:@[openMouth, wait, closeMouth]];
 
[self.crocodile runAction: [SKAction repeatAction:nomnomnomAnimation count:1]];
 
if (!self.scoredPoint) {
    [self animateCrocodile];
}

The code above removes any animation currently running on the crocodile node using removeAllActions. It then creates a new animation sequence that opens and closes the crocodile’s mouth and runs this sequence on the crocodile. At that point, if the player hasn’t scored a point, it runs animateCrocodile, which resets the random opening and closing of the crocodile’s jaw.

Next, locate checkRopeCutWithBody: and, after the self.prize.physicsBody.dynamic = YES; line, add the following line of code:

[self nomnomnomActionWithDelay:1];

This code executes every time the user cuts a rope. It runs the method you just created, giving the illusion that the crocodile is opening its mouth in the hope that something yummy will fall into it.

You also need to run this method in didBeginContact: so that when the prize touches the crocodile, he opens his mouth to eat it.

In didBeginContact:, after the self.scoredPoint = YES; line, add the following line:

[self nomnomnomActionWithDelay:.15];

Just as before, you run nomnomnomActionWithDelay, except this time you run it when the prize collides with the crocodile. This makes the crocodile appear to eat the prize.

Build and run.

Missed Opportunity

The food falls right through the crocodile. You can fix this by making a few simple changes.

Locate checkForScore and add the following block of code:

if (self.scoredPoint) {
    self.scoredPoint = NO;
 
    SKAction *shrink = [SKAction scaleTo:0 duration:0.08];
    SKAction *removeNode = [SKAction removeFromParent];
 
    SKAction *sequence = [SKAction sequence:@[shrink, removeNode]];
    [self.prize runAction: sequence];
}

The code above checks the value of the scoredPoint property. If it’s set to YES, the code sets it back to NO and then shrinks and removes the prize from the scene using an SKAction sequence. (You’ll add the nom-nom-nom sound to this method a bit later.)

You want this code to execute continually to keep track of your variable. To make that happen, you need to modify update:.

Locate update: and add the following line:

[self checkForScore];

update: is invoked before each frame of the animation renders. Here you call the method that checks whether the player scored a point.

The next thing you need to do is check for a ground hit. Locate checkForGroundHit and add the following block of code:

if (self.hitGround) {
    self.hitGround = NO;
 
    SKAction *shrink = [SKAction scaleTo:0 duration:0.08];
    SKAction *removeNode = [SKAction removeFromParent];
 
    SKAction *sequence = [SKAction sequence:@[shrink, removeNode]];
    [self.prize runAction: sequence];
}

Much like checkForScore, this code checks the value of hitGround. If the value is YES, the code resets it to NO and then shrinks and removes the prize from the scene using an SKAction sequence. (You’ll add the splash sound to this method a bit later, too.)

Once again, you need to call this method from update:. Locate update: and add the following line:

[self checkForGroundHit];

With everything in place, build and run the project.

ec_cut_the-rope_end

You should see all of the fabulous things you added. But you’ll also notice that once you score a point or miss the crocodile’s mouth, the game just hangs there. You can fix that!

Adding a Scene Transition

In TLCMyScene.m, find switchToNewGameWithTransition: and add the following block of code:

SKView *skView = (SKView *)self.view;
 
TLCMyScene *scene = [[TLCMyScene alloc] initWithSize:self.size];
[skView presentScene:scene transition:transition];

The code above uses SKView’s presentScene:transition: to present the next scene.

In this case, you present TLCMyScene. You also pass in a transition using the SKTransition class.

You need to call this method in two places: checkForScore and checkForGroundHit.

In checkForGroundHit, add the following lines of code at the end of the if statement (within the braces):

SKTransition *sceneTransition = [SKTransition fadeWithDuration:1.0];
[self performSelector:@selector(switchToNewGameWithTransition:) withObject:sceneTransition afterDelay:1.0];

Next, in checkForScore, add the following block of code, also at the end of the if statement (within the braces):

/* Various kinds of scene transitions */
 
NSArray *transitions = @[[SKTransition doorsOpenHorizontalWithDuration:1.0],
                         [SKTransition doorsOpenVerticalWithDuration:1.0],
                         [SKTransition doorsCloseHorizontalWithDuration:1.0],
                         [SKTransition doorsCloseVerticalWithDuration:1.0],
                         [SKTransition flipHorizontalWithDuration:1.0],
                         [SKTransition flipVerticalWithDuration:1.0],
                         [SKTransition moveInWithDirection:SKTransitionDirectionLeft duration:1.0],
                         [SKTransition pushWithDirection:SKTransitionDirectionRight duration:1.0],
                         [SKTransition revealWithDirection:SKTransitionDirectionDown duration:1.0],
                         [SKTransition crossFadeWithDuration:1.0],
                         [SKTransition doorwayWithDuration:1.0],
                         [SKTransition fadeWithColor:[UIColor darkGrayColor] duration:1.0],
                         [SKTransition fadeWithDuration:1.0]];
 
int randomIndex = arc4random_uniform((int)transitions.count);
 
[self performSelector:@selector(switchToNewGameWithTransition:) withObject:transitions[randomIndex] afterDelay:1.0];

The code above stores all of the available transitions in an NSArray, then selects a random one using the arc4random_uniform function. The chosen transition is passed to switchToNewGameWithTransition:, so you should see a different transition after each game.

Now build and run the project.

ec_cut_the-rope_transition

You should see the scene transition to a new one whenever the player scores a point or loses the prize.

You need to make one final modification to handle when the prize leaves the screen. This can happen if the user cuts the ropes in such a way as to “throw” the prize off the screen.

To handle this case, add the following code to checkForPrize:

[self.worldNode enumerateChildNodesWithName:kNodeNameForPrize usingBlock:^(SKNode *node, BOOL *stop)
 {
     if (node.position.y <= 0) {
         [node removeFromParent];
         self.hitGround = YES;
     }
 }];

The code above enumerates through the child nodes in worldNode to find one whose name matches the specified constant – in this case, the name of the prize node. If that node’s y-position has dropped to zero or below, the prize has left the screen without reaching the crocodile, so the code removes the node and sets hitGround to YES.

Again, you need to add a call to checkForPrize in update:. Locate update: and add the following line:

[self checkForPrize];

Finally, remember that your user can still swipe the prize to score an easy victory. You may have noticed this in your testing. I call this the Cheater Bug. To fix this, locate checkRopeCutWithBody: and add the following just above the for loop line (for (SKPhysicsJoint *joint in body.joints) { … }):

if ([node.name isEqualToString:kNodeNameForPrize]) {
    return;
}

The code above checks if the user has swiped the prize node by looking for a name match. If there is a match, the method returns and does nothing.

Life Without Music and Sound. So Boring!

While the game is technically complete, it lacks a certain something. A silent game may quickly bore your users. It’s time to add a little “juice” to make things pop.

I’ve selected a nice jungle song from incompetech.com and some sound effects from freesound.org.

Because this game will play music in the background, it makes sense to use a single AVAudioPlayer owned by the app delegate. You don’t need to add it, because the starter project already contains a property in TLCAppDelegate.h for an AVAudioPlayer (backgroundMusicPlayer). You simply need to add the playBackgroundMusic: method and then call that method.
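For reference, the relevant declaration in the starter’s TLCAppDelegate.h presumably looks something like this – a sketch, so check the starter project for the exact code:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
 
@interface TLCAppDelegate : UIResponder <UIApplicationDelegate>
 
@property (strong, nonatomic) UIWindow *window;
@property (strong, nonatomic) AVAudioPlayer *backgroundMusicPlayer;
 
@end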

Open TLCMyScene.m and locate playBackgroundMusic:. Add the following code:

NSError *error;
NSURL *backgroundMusicURL = [[NSBundle mainBundle] URLForResource:filename withExtension:nil];
 
TLCAppDelegate *appDelegate = (TLCAppDelegate *)[[UIApplication sharedApplication] delegate];
 
if (!appDelegate.backgroundMusicPlayer) // not yet initialized, go ahead and set it up
{
    appDelegate.backgroundMusicPlayer = nil;
    appDelegate.backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
    appDelegate.backgroundMusicPlayer.numberOfLoops = -1;
    appDelegate.backgroundMusicPlayer.volume = 1.0;
    [appDelegate.backgroundMusicPlayer prepareToPlay];
}
 
if (!appDelegate.backgroundMusicPlayer.isPlaying) // is it currently playing? if not, play music
{
    [appDelegate.backgroundMusicPlayer play];
}

The code above checks if the instance of backgroundMusicPlayer has been initialized. If not, it initializes it with some basic settings, like the number of loops, the volume and the URL of the file to play, whose name is passed into the method as a parameter.

Note: AVAudioPlayer isn’t specific to Sprite Kit, so this tutorial won’t cover it in detail. To learn more about AVAudioPlayer, check out our Audio Tutorial for iOS.

Once the method has initialized the music player, it checks if the music player is already playing, and turns it on if it’s not.

You need this check so that when the scene reloads after the player scores a point or the prize hits the ground, the music won’t “skip” or “restart.” Is this necessary? No. Does it sound better? Absolutely.

Locate setupSounds and add the following line:

[self playBackgroundMusic:kSoundFileNameForBackgroundMusic];

That line makes a call to the method you just wrote. By the way, did you catch that constant you’re using? If you did, you score two extra points. You defined the constant kSoundFileNameForBackgroundMusic in TLCSharedConstants.m earlier.

You may as well add sound effects while you’re at it!

For the last time, locate the @interface section of TLCMyScene.m and add the following properties:

@property (nonatomic, strong) SKAction *soundCutAction;
@property (nonatomic, strong) SKAction *soundSplashAction;
 
@property (nonatomic, strong) SKAction *soundNomNomNomAction;

Next, locate setupSounds. Just above the last line, add the code below:

self.soundCutAction = [SKAction playSoundFileNamed:kSoundFileNameForCutAction waitForCompletion:NO];
self.soundSplashAction = [SKAction playSoundFileNamed:kSoundFileNameForSplashAction waitForCompletion:NO];
self.soundNomNomNomAction = [SKAction playSoundFileNamed:kSoundFileNameForBiteAction waitForCompletion:NO];

This code initializes the variables using SKAction’s playSoundFileNamed:waitForCompletion: method.

In TLCMyScene.m, find checkForGroundHit and add the following line of code just above the SKAction *shrink = [SKAction scaleTo:0 duration:0.08]; line:

[self runAction:self.soundSplashAction];

Find checkForScore and add the following line of code, again just above the SKAction *shrink = [SKAction scaleTo:0 duration:0.08]; line:

[self runAction:self.soundNomNomNomAction];

Find checkRopeCutWithBody: and add the following line of code just above the [self nomnomnomActionWithDelay:1]; line:

[self runAction:self.soundCutAction];

Finally, check initWithSize: and make sure the following line appears before [self setupBackground]; – if it’s not already there, add it:

[self setupSounds];

Build and run the project.

ec_cut_the-rope_build_04b

The app should be popping now, yet the discerning player may notice a slight sound bug. In some instances, you may hear both the nom-nom sound and the splashing sound. This happens because the prize can trigger multiple collisions before it’s removed from the scene. To fix this, add a new property in the @interface section:

@property (nonatomic, assign) BOOL roundOutcome;

Next, add the following code to both checkForScore and checkForGroundHit at the top of each if block.

self.roundOutcome = YES;

Finally, replace the contents of update: with the following:

if (!self.roundOutcome) {
    [self checkForScore];
    [self checkForGroundHit];
    [self checkForPrize];
}

By wrapping all of the checks in this condition, you ensure the methods are not called once an outcome has occurred. Build and run, and swipe away. There are no more sound collisions, and you’ll have one very stuffed crocodile :]

Nomnomnomnom

Where to Go From Here?

I hope you enjoyed working through this tutorial as much as I’ve enjoyed writing it. To compare notes, download the CutTheVerlet-Finished completed sample project here.

But, don’t let the fun stop here! Try adding new levels, different ropes, and maybe even a HUD with a score display and timer. Why not!? It’s only code!

If you’d like to learn more about Sprite Kit, be sure to check out our book, iOS Games by Tutorials.

If you have any questions or comments, feel free to join in the discussion below!

How to Create a Game Like Cut the Rope Using Sprite Kit is a post from: Ray Wenderlich


RWDevCon – Open Call for Suggestions


Help us pick topics for RWDevCon!

Recently, we put a poll on our sidebar asking if you would be interested in an official raywenderlich.com conference, focused on high quality Swift/iOS 8 technical content, and connecting as a community.

So far, over 1,000 people said they’d be interested, so we’re going to proceed with this!

The conference will run in February next year, and we are currently in the process of finalizing the dates and location – I’ll post here as soon as we have the final details.

But in the meantime, we’d love to hear what you’d like to see in the conference!

Our Current Thoughts

The conference will be a two-day event (Friday and Saturday) sometime in February, somewhere in Washington DC. Here are our current thoughts:

  • An amazing lineup of speakers. We’ll have tons of amazing speakers who literally wrote the book on their subjects, including Sam Davies, Matthijs Hollemans, Matt Galloway, Mic Pringle, Marin Todorov, and many more!
  • Coordinated iOS 8 and Swift content. I will be personally organizing and curating the content of the conference to make sure it’s top-notch. We’ll have 2-3 tracks of content fully up-to-date with iOS 8 and Swift – trust me, you will learn a ton!
  • Interactive focus. We believe strongly in a hands-on focus, so one of the tracks will be purely focused on interactive content: workshops, labs, panels, group discussions, stump the experts, lightning talks, etc.
  • Connect as a community. One of our main goals for this conference is to connect as a team and community – the other authors and I can’t wait to hang out with you! Expect some fun parties and social events as well :]
  • International focus. Tons of our speakers are coming from abroad and we know many of you are too – so we are right in downtown Washington DC, for an easy trip (plus you can extend your trip for sightseeing)!
  • Organized by the best. We are teaming up with John Wilker from 360iDev to organize the logistics-side of the conference and make sure it’s an amazing experience for all.
  • Non-profit. RWDevCon is totally non-profit – we’re not looking to make any money on this, and will be spending every cent on making it a nice experience for attendees and speakers.

So anyway, these are our current thoughts, but what we are most interested in is the following:

Open Call for Suggestions

Like I said, we have some amazing speakers lined up, and I am personally going to curate the content of this conference, so we have a great opportunity to make this into a “dream conference” for iOS developers.

We’d love to hear your suggestions! Here are some questions for you:

  • What excites you the most about this conference?
  • What kinds of talks would you most like to attend?
  • Are there any particular subjects, APIs, or technologies that you’d particularly like to see covered?
  • Do you have any other suggestions that would make this an amazing conference for you?

Please leave your comments in the forums below or feel free to email me directly at ray@raywenderlich.com.

If you want to get notified when tickets go on sale, sign up here:


Thanks all and we hope to see you at RWDevCon! :]

RWDevCon – Open Call for Suggestions is a post from: Ray Wenderlich


Video Tutorial: Saving Data in iOS Part 11: Core Data NSFetchedResultsController

Video Tutorial: Saving Data in iOS Part 9: Introduction to FMDB

Call for Applicants: Authors, Tech Editors, and Coders!


Apply to join our team today!

As you know, we are always trying to improve our written tutorials, video tutorials, and books on this site.

What we need most is a fresh batch of folks on the team to contribute their passion, ideas and experience.

So today, I am pleased to announce we are having an open call for applicants for three different teams at raywenderlich.com (click links to jump to that section):

  • The Tutorial Team: For the first time since 2011, we are having a public call for applicants for the Tutorial Team! On this team, your job would be to write the highest quality tutorials on the net – whether in written tutorials, video tutorials, or books.
  • The Tech Editing Team: We are also looking for tech editors to join the editing team. On this team, your job would be to make our tutorials shine from a technical perspective.
  • The Code Team: This is a brand new team that we are actively recruiting for the first time ever! On this team, your job would be to develop sample projects demonstrating cool/advanced techniques that we could then “tutorialize.”

For these teams, we are looking for advanced-level developers only. However, your experience could be in a variety of categories (not just iOS):

  • iOS App Development
  • iOS Game Development (with Sprite Kit or Scene Kit)
  • Unity Game Development
  • Android App Development
  • OS X App Development

Are you an advanced-level developer interested in joining one of these teams? Keep reading for more details on what’s involved, the benefits of joining, and how to apply.

 

The Tutorial Team

The Tutorial Team is an elite group of app developers and writers who are joining together to make the highest quality developer tutorials available in the world.

By writing tutorials for this site, you can make a huge positive difference in the developer community. Your tutorials will be widely read, and will help a ton of developers learn and grow. You may even help some developers start their careers making apps or games – making dreams come true!

And through the hard work it takes to write these tutorials and the detailed feedback from the editing team, you will become a much better developer and writer yourself.

Benefits

As a part of the raywenderlich.com Tutorial Team, you’ll receive the following benefits (in addition to learning & helping others, of course):

  • Eyeballs. This site gets a lot of traffic – over 2.5 million pageviews per month and growing! When you publish a tutorial here, it will get read a lot, and people will love you for it. They will tweet your post, comment on it in the forums, and give you feedback. You can be sure that a lot of people will notice and enjoy your hard work.
  • Tutorial Polish. When you submit a tutorial to raywenderlich.com, we will personally work with you to polish your tutorials to a high level of quality. Your tutorial will go through three edit passes: a tech edit, an edit, and a final pass edit. In the end, your tutorial will look much better than when you first submitted it, making you look really good. :]
  • Writing Training. When we are done editing your tutorial, we will send you detailed feedback on how you can improve your tutorials in the future. This will help make you a better developer and writer.
  • Personal Exposure. At the end of any tutorial you write, you can include your picture and a link to any site you would like for exposure (example). In addition, you will be featured on the about page, the raywenderlich.com Team Twitter list, and even the scrolling list of team members on the front page.
  • Money! The first tutorial you write to join the Tutorial Team is not paid since we give you a ton of free stuff instead (see below), but after that you will be paid for your tutorials as long as you stay on the tutorial team. We offer the highest rates in the industry.
  • Special Opportunities. Members of the Tutorial Team get access to opportunities not available to anyone else, such as contributing to our books and products, writing starter kits, working on team projects, and much more.
  • Contracting Opportunities. Members of the Tutorial Team share contracting opportunities we hear about to the rest of the team – a great way to find out about more work.
  • Free Stuff! And as a final bonus, by joining the Tutorial Team you will get a lot of free stuff. You’ll get a free copy of all of the products we sell on the site – over $500 in value total.

This is an informal, part-time position – you’d be writing about 3 tutorials per year. We do expect that when you are assigned a tutorial to write, you complete the tutorial within 1 month.

Requirements and How to Apply

Here are the requirements:

  • You must be an advanced-level developer.
  • You should be comfortable learning brand new topics that you have never done before, which are either not documented or poorly documented.
  • You should be comfortable figuring out how to reproduce complex techniques from other apps with no guidance or help. For example, if you are an app developer you should be capable of reproducing some of these animations, and if you are a game developer you should be capable of reproducing a game like this.
  • You should be a great writer with fluent English writing skills.

To apply, simply send me an email with the following details:

  • Why do you want to join the Tutorial Team?
  • Please tell me a little bit about yourself and your experience.
  • What is the best app you’ve made or worked on, and why are you proud of the work you did on this app? [Please include an App Store link]
  • Please link to any examples of technical writing you have done in the past.
  • Please include links to: your GitHub account, your StackOverflow account, your Twitter account.

I will be selecting a few of the top applicants to tryout for the team by writing their first tutorial. If your tutorial is accepted, you’re in.

 

The Tech Editing Team

Have you ever found a bug, grammar mistake, or technical mistake in one of our tutorials? Well, technical editing might be for you. :]

Our Tech Editors are some of our most experienced developers. We have particularly high standards for what we look for in tech editors and a grueling tryout process.

This is for good reason. As a tech editor, we look to you to “level-up” each tutorial you get your hands on by adding your technical expertise, and make each tutorial as polished as possible.

By improving our tutorials, you make a huge difference in the iOS community by making sure everyone is learning the right stuff. It also really helps our authors learn and improve, and you’ll learn a ton along the way as well – while getting paid. :]

Note: We had a call for tech editors just a few months ago. But now that we’ve started a brand new Update Team, our tech editors are getting stretched a bit thin, so we could use a few more. If you applied before but didn’t hear back, feel free to try again.

Benefits

There are many great reasons to be a technical editor for raywenderlich.com:

  • You’ll become a better developer. Being a technical editor forces you to carve out learning time in your schedule, which can be difficult to do otherwise, due to the demands of a day job.
  • You’ll become a better writer. By reviewing tutorials, you will also learn a lot about writing. Both from helping other authors improve, and from learning from other fellow editors.
  • You’ll get a lot of exposure. You’ll get to know each member of the Tutorial Team, their experience level and writing style, and work with me and the other editors more closely than just about anyone else. These contacts may prove to be invaluable, and will likely lead to some special opportunities.
  • You’ll get paid! We also pay for each tech edit performed, and you get a ton of free stuff for joining too. So basically, you’re getting paid to learn. :]

This is an informal, part-time position – you’d be editing about 1-3 tutorials per month. We do expect that when you are assigned a tutorial to tech edit, you complete the tech edit within 1 week.

Requirements and How to Apply

Here are the requirements:

  • You must be an advanced-level developer. You are probably a team lead at your full time job.
  • [If you're an iOS developer] You must have been digging into Swift already. You probably can get a good score on our Swift Ninja programming challenge.
  • You must read a lot of technical books, blogs, and/or podcasts and be up-to-speed with the latest news and techniques.
  • You have a very detail-oriented/pedantic personality.
  • You should be a great writer with fluent English writing skills.

To apply, simply send me an email with the following details:

  • Why do you want to join the Tech Editing Team?
  • Please tell me a little bit about yourself and your experience.
  • What is the best app you’ve made or worked on, and why are you proud of the work you did on this app? [Please include an App Store link]
  • What technical blogs and podcasts do you follow on a regular basis?
  • Please link to any examples of technical writing you have done in the past.
  • Please include links to: your GitHub account, your StackOverflow account, your Twitter account.

I will be selecting a few of the top applicants to tryout for the team by going through a multi-phase tryout process – I will send you more details if you’re selected.

 

The Code Team

The Code Team is for developers who are awesome at writing code, but who are not interested in (or maybe not good at) writing a tutorial about their code.

Your job is to write cool advanced level sample projects demonstrating neat techniques that other developers would be interested in learning about. For example, these are the kinds of projects we’d be looking for:

  • Reproducing the “book open” animation in Paper by FiftyThree
  • Reproducing some of these popular iOS animations
  • Reproducing other popular UI techniques like Sam Page has done on subjc.com
  • Reproducing 2D lighting in Unity like in this post
  • Prototyping water simulation in Sprite Kit
  • Investigating popular open-source Auto Layout libraries and writing a cool project demonstrating usage of each
  • Making animated 3D bar charts and line graphs with Scene Kit
  • …and much more (we want you to surprise and delight us)!

If some of these projects sound like a fun challenge to you and something you’d definitely be capable of, you might be a good match for the Code Team. :]

Benefits

As a part of the raywenderlich.com Code Team, you’ll receive the following benefits:

  • It’s a fun learning experience. The kind of person we’re looking for on the Code Team is the kind of person who loves figuring out cool things like this for fun and learning. This is a perfect chance to get some fun learning challenges that also have a practical application.
  • Make a tutorial happen – without having to write it! If you wished you could make a tutorial to contribute to the community without having to do the writing part – now you can. Just make an awesome project and hand it over to us – we’ll take it from there.
  • Mad bragging rights. Whenever we write a tutorial using your technique, we’ll give you full credit as the brains behind it – check this post by the always-epic Orta for an example :]
  • You’ll get paid! We also will pay you a set fee for each project, and you get a ton of free stuff for joining too. So basically, you’re getting paid to learn. :]

This is an informal, part-time position – you’d be writing about 3 sample projects per year. We do expect that when you are assigned a sample project to write, you complete the sample project within 1 month.

Requirements and How to Apply

Here are the requirements:

  • You must be an advanced-level developer.
  • You should be comfortable learning brand new and difficult topics that you have never done before, such as some of the earlier examples, which are either not documented or poorly documented.

To apply, just send me an email with the following details:

  • Why do you want to join the Code Team?
  • A link to your GitHub account

I will be selecting a few of the top applicants to try out for the team by writing your first sample project. If your sample project is accepted, you’re in.

Where To Go From Here?

Thanks so much for your consideration in joining one of our teams!

Please note that we usually get hundreds of emails when we do a public call for applicants, so please understand we may not have time to respond to everyone. We do promise to read each and every email though.

We can’t wait to welcome some of you to our team, and look forward to hanging out with you and getting to know you.

If you have any questions or comments, please join the forum discussion below.

Call for Applicants: Authors, Tech Editors, and Coders! is a post from: Ray Wenderlich

The post Call for Applicants: Authors, Tech Editors, and Coders! appeared first on Ray Wenderlich.


OpenGL ES Pixel Shaders Tutorial

OpenGL ES Pixel Shaders

Bark at the Moon.fsh!

In this pixel shaders tutorial, you’ll learn how to turn your iPhone into a full-screen GPU canvas.

What this means is that you’ll make a low-level, graphics-intensive app that will paint every pixel on your screen individually by combining interesting math equations.

But why? Well, besides being the absolute coolest things in computer graphics, pixel shaders can be very useful in:

Note: The demos linked above use WebGL, which is only fully supported on Chrome and Opera, at least at the time of writing this tutorial. These demos are also pretty intense, so try not to run them in multiple tabs simultaneously.

The shaders you’ll write are not as complex as the ones above, but you’ll get a lot more out of these exercises if you’re familiar with OpenGL ES. If you’re new to the API, then please check out some of our written or video tutorials on the subject first :]

Without further ado, it is my pleasure to get you started with pixel shaders in iOS!

Note: The term “graphics-intensive” is no joke in this tutorial. This app will safely push your iPhone’s GPU to its limit, so use an iPhone 5 or newer version. If you don’t have an iPhone 5 or later, the iOS simulator will work just fine.

Getting Started

First, download the starter pack for this tutorial. Have a look at RWTViewController.m to see the very light GLKViewController implementation, and then build and run. You should see the screen below:

s_Run1

Nothing too fancy just yet, but I’m sure Green Man would approve :]

For the duration of this tutorial, a full green screen means your base shaders (RWTBase.vsh and RWTBase.fsh) are in working order and your OpenGL ES code is set up properly. Throughout this tutorial, green means “Go” and red means “Stop”.

If at any point you find yourself staring at a full red screen, you should “Stop” and verify your implementation, because your shaders failed to compile and link properly. This works because the viewDidLoad method in RWTViewController sets glClearColor() to red.

A quick look at RWTBase.vsh reveals one of the simplest vertex shaders you’ll ever encounter. All it does is calculate a point on the x-y plane, defined by aPosition.

The vertex attribute array for aPosition is a quad anchored to each corner of the screen (in OpenGL ES coordinates), named RWTBaseShaderQuad in RWTBaseShader.m. RWTBase.fsh is an even simpler fragment shader that colors all fragments green, regardless of position. This explains your bright green screen!

Now, to break this down a bit further…

Pixel Shaders vs Vertex/Fragment Shaders

If you’ve taken some of our previous OpenGL ES tutorials, you may have noticed that we talk about vertex shaders for manipulating vertices and fragment shaders for manipulating fragments. Essentially, a vertex shader draws objects and a fragment shader colors them. Fragments may or may not produce pixels depending on factors such as depth, alpha and viewport coordinates.

So, what happens if you render a quad defined by four vertices as shown below?

g_Quad

Assuming you haven’t enabled alpha blending or depth testing, you get an opaque, full-screen cartesian plane.

Under these conditions, after the primitive rasterizes, it stands to reason that each fragment corresponds to exactly one pixel of the screen – no more, no less. Therefore, the fragment shader will color every screen pixel directly, thus earning itself the name of pixel shader :O

Note: By default, GL_BLEND and GL_DEPTH_TEST are disabled. You can see a list of glEnable() and glDisable() capabilities here, and you can query them programmatically using the function glIsEnabled().

Pixel Shaders 101: Gradients

Your first pixel shader will be a gentle lesson in computing linear gradients.

Note: In order to conserve space and focus on the algorithms/equations presented in this tutorial, the global GLSL precision value for floats is defined as highp.

The official OpenGL ES Programming Guide for iOS has a small section dedicated to precision hints which you can refer to afterwards for optimization purposes, along with the iOS Device Compatibility Reference.
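For reference, that global precision declaration sits at the top of each fragment shader in the starter project, and it’s just this single line:

// Default precision for all float values in this fragment shader
precision highp float;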

Remember, for a full-screen iPhone 5, your fragment shader gets called 727,040 times per frame – once for each of the 640×1136 pixels!

The magic behind pixel shaders lies within gl_FragCoord. This fragment-exclusive variable contains the window-relative coordinates of the current fragment.

For a normal fragment shader, “this value is the result of fixed functionality that interpolates primitives after vertex processing to generate fragments”. For pixel shaders, however, just know the xy swizzle value of this variable maps exactly to one unique pixel on the screen.

Open RWTGradient.fsh and add the following lines just below precision:

// Uniforms
uniform vec2 uResolution;

uResolution comes from the rect variable of glkView:drawInRect: within RWTViewController.m (i.e. the rectangle containing your view).

RWTBaseShader.m takes the width and height of rect and assigns them to the corresponding GLSL uniform in the method renderInRect:atTime:. All this means is that uResolution contains the x-y resolution of your screen.

Many times you’ll greatly simplify pixel shader equations by converting pixel coordinates to the range 0.0 ≤ xy ≤ 1.0, achieved by dividing gl_FragCoord.xy by uResolution. This is a perfect range for gl_FragColor too, so let’s see some gradients!

Add the following lines to RWTGradient.fsh inside main(void):

vec2 position = gl_FragCoord.xy/uResolution;
float gradient = position.x;
gl_FragColor = vec4(0., gradient, 0., 1.);

Next, change your program’s fragment shader source from RWTBase to RWTGradient in RWTViewController.m by changing the following line:

self.shader = [[RWTBaseShader alloc] initWithVertexShader:@"RWTBase" fragmentShader:@"RWTBase"];

to:

self.shader = [[RWTBaseShader alloc] initWithVertexShader:@"RWTBase" fragmentShader:@"RWTGradient"];

Build and run! Your screen should show a really nice black->green gradient from left->right

s_Run2

Pretty cool, eh? To get the same gradient from bottom->top, change the following line in RWTGradient.fsh:

float gradient = position.x;

to:

float gradient = position.y;

Build and run again to see your gradient’s new direction…

s_Run3

Now it’s time for a challenge! See if you can reproduce the screenshot below by just changing one line of code in your shader.

s_Run4

Hint: Remember that position ranges from 0.0 to 1.0 and so does gl_FragColor.

Solution Inside: Diagonal Gradient
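If you want to check your answer, here’s one possible solution – a minimal sketch that averages the two axes so the gradient value stays in the 0.0 to 1.0 range (other weightings work too):

float gradient = (position.x + position.y)/2.;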

Well done if you figured it out! If you didn’t, just take a moment to review this section again before moving on. :]

Pixel Shader Geometry

In this section, you’ll learn how to use math to draw simple shapes, starting with a 2D disc/circle and finishing with a 3D sphere.

Geometry: 2D Disc

Open RWTSphere.fsh and add the following lines just below precision:

// Uniforms
uniform vec2 uResolution;

This is the same uniform encountered in the previous section and it’s all you’ll need to generate static geometry. To create a disc, add the following lines inside main(void):

// 1
vec2 center = vec2(uResolution.x/2., uResolution.y/2.);
 
// 2
float radius = uResolution.x/2.;
 
// 3
vec2 position = gl_FragCoord.xy - center;
 
// 4
if (length(position) > radius) {
  gl_FragColor = vec4(vec3(0.), 1.);
} else {
  gl_FragColor = vec4(vec3(1.), 1.);
}

There’s a bit of math here and here are the explanations of what’s happening:

  1. The center of your disc will be located exactly in the center of your screen.
  2. The radius of your disc will be half the width of your screen.
  3. position is defined by the coordinates of the current pixel, offset by the disc center. Think of it as a vector pointing from the center of the disc to the current pixel.
  4. length() calculates the length of a vector, which in this case is defined by the Pythagorean Theorem √(position.x²+position.y²).
    1. If the resulting value is greater than radius, then that particular pixel lies outside the disc area and you color it black.
    2. Otherwise, that particular pixel lies within the disc and you color it white.

For an explanation of this behavior, look to the circle equation defined as: (x-a)²+(y-b)² = r². Note that r is the radius, (a,b) is the center and (x,y) is the set of all points on the circle.

Since a disc is the region in a plane bounded by a circle, the if-else statement will accurately draw a disc in space!

Before you build and run, change your program’s fragment shader source to RWTSphere in RWTViewController.m:

self.shader = [[RWTBaseShader alloc] initWithVertexShader:@"RWTBase" fragmentShader:@"RWTSphere"];

Now, build and run. Your screen should show a solid white disc with a black background. No, it’s not the most innovative design, but you have to start somewhere.

s_Run5

Feel free to play around with some of the disc’s properties and see how modifications affect your rendering. For an added challenge, see if you can make the circle shape shown below:

s_Run6

Hint: Try creating a new variable called thickness defined by your radius and used in your if-else conditional.

Solution Inside: Skinny Circle
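Here’s one possible solution – a minimal sketch where the thickness value is an arbitrary assumption, so feel free to tweak it:

// Only color pixels within a thin ring just inside the circle
float thickness = radius/50.;
 
if (length(position) > radius || length(position) < radius - thickness) {
  gl_FragColor = vec4(vec3(0.), 1.);
} else {
  gl_FragColor = vec4(vec3(1.), 1.);
}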

If you attempted the challenge or modified your GLSL code, please revert back to that basic solid white disc for now (Kudos for your curiosity though!).

Replace your if-else conditional with the following:

if (length(position) > radius) {
  discard;
}
 
gl_FragColor = vec4(vec3(1.), 1.);

Dear reader, please let me introduce you to discard. discard is a fragment-exclusive keyword that effectively tells OpenGL ES to discard the current fragment and ignore it in the following stages of the rendering pipeline. Build and run to see the screen below:

s_Run7

In pixel shader terminology, discard returns an empty pixel that isn’t written to the screen. Therefore, the color set by glClearColor() shows through for the actual screen pixel in its place.

From this point on, when you see a bright red pixel, it means discard is working properly. But you should still be wary of a full red screen, as it means something in the code is not right.

Geometry: 3D Sphere

Now it’s time to put a new spin on things and convert that drab 2D disc to a 3D sphere, and to do that you need to account for depth.

In a typical vertex+fragment shader program, this would be simple. The vertex shader could handle 3D geometry input and pass along any information necessary to the fragment shader. However, when working with pixel shaders you only have a 2D plane on which to “paint”, so you’ll need to fake depth by inferring z values.

Several paragraphs ago you created a disc by coloring any pixels inside a circle defined by:

(x-a)²+(y-b)² = r²

Extending this to the sphere equation is very easy, like so:

(x-a)²+(y-b)²+(z-c)² = r²

c is the z center of the sphere. Since the circle center (a,b) already offsets your 2D coordinates and your new sphere will lie on the z origin, this equation can be simplified to:

x²+y²+z² = r²

Solving for z results in the equation:


z = √(r²-x²-y²)

And that’s how you can infer a z value for all fragments, based on their unique position! Luckily enough, this is very easy to code in GLSL. Add the following lines to RWTSphere.fsh just before gl_FragColor:

float z = sqrt(radius*radius - position.x*position.x - position.y*position.y);
z /= radius;

The first line calculates z as per your reduced equation, and the second divides by the sphere radius to constrain the result to the range 0.0 to 1.0.

In order to visualize your sphere’s depth, replace your current gl_FragColor line with the following:

gl_FragColor = vec4(vec3(z), 1.);

Build and run to see your flat disc now has a third dimension.

s_Run8

Since positive z-values are directed outwards from the screen towards the viewer, the closest points on the sphere are white (middle) while the furthest points are black (edges).

Naturally, any points in between are part of a smooth, gray gradient. This piece of code is a quick and easy way to visualize depth, but it ignores the xy values of the sphere. If this shape were to rotate or sit alongside other objects, you couldn’t tell which way is up/down or left/right.

Replace the line:

z /= radius;

With:

vec3 normal = normalize(vec3(position.x, position.y, z));

A better way to visualize orientation in 3D space is with the use of normals. In this example, normals are vectors perpendicular to the surface of your sphere. For any given point, a normal defines the direction that point faces.

In the case of this sphere, calculating the normal for each point is easy. You already have a vector (position) that points from the center of the sphere to the current point, plus the z value you just computed. Together, they give the direction the point is facing – the normal.

If you’ve worked through some of our previous OpenGL ES tutorials, you know that it’s also generally a good idea to normalize() vectors, in order to simplify future calculations (particularly for lighting).

Normalized normals lie within the range -1.0 ≤ n ≤ 1.0, while pixel color channels lie within the range 0.0 ≤ c ≤ 1.0. In order to visualize your sphere’s normals properly, define a normal n to color c conversion like so:

-1.0 ≤ n ≤ 1.0
-1.0+1.0 ≤ (n+1.0) ≤ 1.0+1.0
0.0 ≤ (n+1.0) ≤ 2.0
0.0/2.0 ≤ (n+1.0)/2.0 ≤ 2.0/2.0
0.0 ≤ (n+1.0)/2.0 ≤ 1.0
0.0 ≤ c ≤ 1.0
c = (n+1.0)/2.0

Voilà! It’s just that simple.

Now, replace the line:

gl_FragColor = vec4(vec3(z), 1.);

With:

gl_FragColor = vec4((normal+1.)/2., 1.);

Then build and run. Prepare to feast your eyes on the round rainbow below:

s_Run9

This might seem confusing at first, particularly when your previous sphere rendered so smoothly, but there is a lot of valuable information hidden within these colors…

What you’re seeing now is essentially a normal map of your sphere. In a normal map, rgb colors represent surface normals which correspond to actual xyz coordinates, respectively. Take a look at the following diagram:

g_NormalsColors

The rgb color values for the circled points are:

p0c = (0.50, 0.50, 1.00)
p1c = (0.50, 1.00, 0.53)
p2c = (1.00, 0.50, 0.53)
p3c = (0.50, 0.00, 0.53)
p4c = (0.00, 0.50, 0.53)

Previously, you calculated a normal n to color c conversion. Using the reverse equation, n = (c*2.0)-1.0, these colors can be mapped to specific normals:

p0n = (0.00, 0.00, 1.00)
p1n = (0.00, 1.00, 0.06)
p2n = (1.00, 0.00, 0.06)
p3n = (0.00, -1.00, 0.06)
p4n = (-1.00, 0.00, 0.06)

Which, when represented with arrows, look a bit like this:

g_NormalsCoordinates

Now, there should be absolutely no ambiguity for the orientation of your sphere in 3D space. Furthermore, you can now light your object properly!

Add the following lines above main(void) in RWTSphere.fsh:

// Constants
const vec3 cLight = normalize(vec3(.5, .5, 1.));

This constant defines the orientation of a virtual light source that illuminates your sphere. In this case, the light gleams towards the screen from the top-right corner.

Next, replace the following line:

gl_FragColor = vec4((normal+1.)/2., 1.);

With:

float diffuse = max(0., dot(normal, cLight));
 
gl_FragColor = vec4(vec3(diffuse), 1.);

You may recognize this as the simplified diffuse component of the Phong reflection model. Build and run to see your nicely-lit sphere!

s_Run10

Note: To learn more about the Phong lighting model, check out our Ambient, Diffuse, and Specular video tutorials [Subscribers Only].

3D objects on a 2D canvas? Just using math? Pixel-by-pixel? WHOA

This is a great time for a little break so you can bask in the soft, even glow of your shader in all of its glory…and also clear your head a bit because, dear reader, you’ve only just begun.

Pixel Shader Procedural Textures: Perlin Noise

In this section, you’ll learn all about texture primitives, pseudorandom number generators, and time-based functions – eventually working your way up to a basic noise shader inspired by Perlin noise.

The math behind Perlin Noise is a bit too dense for this tutorial, and a full implementation is actually too complex to run at 30 FPS.

The basic shader here, however, will still cover a lot of noise essentials (with particular thanks to the modular explanations/examples of Hugo Elias and Toby Schachman).

Ken Perlin developed Perlin noise in 1981 for the movie TRON, and it’s one of the most groundbreaking, fundamental algorithms in computer graphics.

It can mimic pseudorandom patterns in natural elements, such as clouds and flames. It is so ubiquitous in modern CGI that Ken Perlin eventually received an Academy Award in Technical Achievement for this technique and its contributions to the film industry.

The award itself explains the gist of Perlin Noise quite nicely:

“To Ken Perlin for the development of Perlin Noise, a technique used to produce natural appearing textures on computer generated surfaces for motion picture visual effects. The development of Perlin Noise has allowed computer graphics artists to better represent the complexity of natural phenomena in visual effects for the motion picture industry.”

Clouds: Noise translates in x and z

Flames: Noise scales in x,y, translates in z

So yeah, it’s kind of a big deal… and you’ll get to implement it from the ground up.

But first, you must familiarize yourself with time inputs and math functions.

Procedural Textures: Time

Open RWTNoise.fsh and add the following lines just below precision highp float;

// Uniforms
uniform vec2 uResolution;
uniform float uTime;

You’re already familiar with the uResolution uniform, but uTime is a new one. uTime comes from the timeSinceFirstResume property of your GLKViewController subclass, implemented as RWTViewController.m (i.e. time elapsed since the first time the view controller resumed update events).

RWTBaseShader.m takes this time interval and assigns it to the corresponding GLSL uniform in the method renderInRect:atTime:, meaning that uTime contains the elapsed time of your app, in seconds.

To see uTime in action, add the following lines to RWTNoise.fsh, inside main(void):

float t = uTime/2.;
if (t>1.) {
  t -= floor(t);
}
 
gl_FragColor = vec4(vec3(t), 1.);

This simple algorithm will cause your screen to repeatedly fade in from black to white.

The variable t is half the elapsed time, and it needs to be mapped into the color range 0.0 to 1.0. The function floor() accomplishes this by returning the nearest integer less than or equal to t, which you then subtract from t.

For example, when uTime = 5.50, t works out to 0.75 and your screen will be 75% white:

t = 2.75
floor(t) = 2.00
t = t - floor(t) = 0.75

Before you build and run, remember to change your program’s fragment shader source to RWTNoise in RWTViewController.m:

self.shader = [[RWTBaseShader alloc] initWithVertexShader:@"RWTBase" fragmentShader:@"RWTNoise"];

Now build and run to see your simple animation!

You can reduce the complexity of your implementation by replacing your if statement with the following line:

t = fract(t);

fract() returns a fractional value for t, calculated as t - floor(t). Ahhh, there, that’s much better.

Now that you have a simple animation working, it’s time to make some noise (Perlin noise, that is).

Procedural Textures: “Random” Noise

fract() is an essential function in fragment shader programming. It keeps all values within 0.0 and 1.0, and you’ll be using it to create a pseudorandom number generator (PRNG) that will approximate a white noise image.

Since Perlin noise models natural phenomena (e.g. wood, marble), PRNG values work perfectly because they are random-enough to seem natural, but are actually backed by a mathematical function that will produce subtle patterns (e.g. the same seed input will produce the same noise output, every time).

Controlled chaos is the essence of procedural texture primitives!



Note: Computer randomness is a deeply fascinating subject that could easily span dozens of tutorials and extended forum discussions. arc4random() in Objective-C is a luxury for iOS developers. You can learn more about it from NSHipster, a.k.a. Mattt Thompson. As he so elegantly puts it, “What passes for randomness is merely a hidden chain of causality”.

The PRNG you’ll be writing will be largely based on sine waves, since sine waves are cyclical, which is great for time-based inputs. Sine waves are also straightforward to use, as it’s just a matter of calling sin().

They are also easy to dissect. Most other GLSL PRNGs are either great but incredibly complex, or simple but unreliable.

But first, a quick visual recap of sine waves:

g_SineWaveDiagram

You may already be familiar with the amplitude A and wavelength λ. However, if you’re not, don’t worry too much about them; after all, the goal is to create random noise, not smooth waves.

For a standard sine wave, the output ranges from -1.0 to 1.0 (a peak-to-peak amplitude of 2.0) and the wavelength is equal to 2π (frequency = 1).

In the image above, you are viewing the sine wave from the “front”, but if you view it from the “top” you can use the wave’s crests and troughs to draw a smooth greyscale gradient, where crest = white and trough = black.

Open RWTNoise.fsh and replace the contents of main(void) with the following:

vec2 position = gl_FragCoord.xy/uResolution.xy;
 
float pi = 3.14159265359;
float wave = sin(2.*pi*position.x);
wave = (wave+1.)/2.;
 
gl_FragColor = vec4(vec3(wave), 1.);

Remember that sin() has a period of 2π, and here you multiply 2π by the fraction along the x-axis for the current pixel. This way, the far left side of the screen maps to the start of the sine wave, and the far right side of the screen maps to the end of one full cycle.

Also remember the output of sin is between -1 and 1, so you add 1 to the result and divide it by 2 to get the output in the range of 0 to 1.

Build and run. You should see a smooth sine wave gradient with one crest and one trough.

s_Run11

Transferring the current gradient to the previous diagram would look something like this:

g_SineWaveGradient

Now, make that wavelength shorter by increasing its frequency and factoring in the y-axis of the screen.

Change your wave calculation to:

float wave = sin(4.*2.*pi*(position.x+position.y));

Build and run. You should see that your new wave not only runs diagonally across the screen, but also has way more crests and troughs (the new frequency is 4).

s_Run12

So far the equations in your shader have produced neat, predictable results and formed orderly waves. But the goal is entropy, not order, so now it’s time to start breaking things a bit. Of course, this is a calm, controlled kind of breaking, not a bull-in-a-china-shop kind of breaking.

Replace the following lines:

float wave = sin(4.*2.*pi*(position.x+position.y));
wave = (wave+1.)/2.;

With:

float wave = fract(sin(16.*2.*pi*(position.x+position.y)));

Build and run. What you’ve done here is increase the frequency of the waves and use fract() to introduce harder edges in your gradient. You’re also no longer performing a proper conversion between different ranges, which adds a bit of spice in the form of chaos.

s_Run13a

The pattern generated by your shader is still fairly predictable, so go ahead and throw another wrench in the gears.

Change your wave calculation to:

float wave = fract(10000.*sin(16.*(position.x+position.y)));

Now build and run to see a salt & pepper spill.


s_Run14


The 10000 multiplier is great for generating pseudorandom values, and the following table of sine values helps explain why:

Angle (degrees)   sin(a)
1.0   .0174
2.0   .0349
3.0   .0523
4.0   .0698
5.0   .0872
6.0   .1045
7.0   .1219
8.0   .1392
9.0   .1564
10.0  .1736

Observe the sequence of numbers for the second decimal place:

1, 3, 5, 6, 8, 0, 2, 3, 5, 7

Now observe the sequence of numbers for the fourth decimal place:

4, 9, 3, 8, 2, 5, 9, 2, 4, 6

A pattern is more apparent in the first sequence, but less so in the second. While this may not always be the case, less significant decimal places are a good starting place for mining pseudorandom numbers.

It also helps that really large numbers may introduce unintentional precision loss or overflow errors.

At the moment, you can probably still see a glimpse of a wave imprinted diagonally on the screen. If not, it might be time to pay a visit to your optometrist. ;]

The faint wave is simply a product of your calculation giving equal importance to position.x and position.y values. Adding a unique multiplier to each axis will dissipate the diagonal print, like so:

float wave = fract(10000.*sin(128.*position.x+1024.*position.y));

s_Run15

Time for a little clean up! Add the following function, randomNoise(vec2 p), above main(void):

float randomNoise(vec2 p) {
  return fract(6791.*sin(47.*p.x+p.y*9973.));
}

The most random part about this PRNG is your choice of multipliers.

I chose the ones above from a list of prime numbers, and you can use them too. If you select your own numbers, I would recommend a small multiplier for p.x, and larger ones for p.y and sin().

Next, refactor your shader to use your new randomNoise function by replacing the contents of main(void) with the following:

vec2 position = gl_FragCoord.xy/uResolution.xy;
float n = randomNoise(position);
gl_FragColor = vec4(vec3(n), 1.);

Presto! You now have a simple sin-based PRNG for creating 2D noise. Build and run, then take a break to celebrate, you’ve earned it.

s_Run16

Procedural Textures: Square Grid

When working with a 3D sphere, normalizing vectors makes equations much simpler, and the same is true for procedural textures, particularly noise. Functions like smoothing and interpolation are a lot easier if they happen on a square grid. Open RWTNoise.fsh and replace the calculation for position with this:

vec2 position = gl_FragCoord.xy/uResolution.xx;

This ensures that one unit of position is equal to the width of your screen (uResolution.x).

On the next line, add the following if statement:

if ((position.x>1.) || (position.y>1.)) {
  discard;
}

Make sure you give discard a warm welcome back into your code, then build and run to render the image below:

s_Run17

This simple square acts as your new 1×1 pixel shader viewport.

g_Square

Since 2D noise extends infinitely in x and y, you can shift the sampled region by replacing your noise input with either of the following lines:

float n = randomNoise(position-1.);
float n = randomNoise(position+1.);

This is what you’ll see:

g_SquareMinus
g_SquarePlus

For any noise-based procedural texture, there is a primitive-level distinction between too much noise and not enough noise. Fortunately, tiling your square grid makes it possible to control this.

Add the following lines to main(void), just before n:

float tiles = 2.;
position = floor(position*tiles);

Then build and run! You should see a 2×2 square grid like the one below:

s_Run18

This might be a bit confusing at first, so here’s an explanation:
g_TilesDiagram

floor(position*tiles) will truncate any value to the nearest integer less than or equal to position*tiles, which here lies in the range (0.0, 0.0) to (2.0, 2.0).

Without floor(), this range would be continuously smooth and every fragment position would seed noise() with a different value.

However, floor() creates a stepped range with stops at every integer, as shown in the diagram above. Therefore, every position value in-between two integers will be truncated before seeding noise(), creating a nicely-tiled square grid.

The number of square tiles you choose will depend on the type of texture effect you want to create. Perlin noise adds many grids together to compute its noisy pattern, each with a different number of tiles.

There is such a thing as too many tiles, which often results in blocky, repetitive patterns. For example, the square grid for tiles = 128. looks something like this:

g_Tiles128Random

Procedural Textures: Smooth Noise

At the moment, your noise texture is a bit too, ahem, noisy. This is good if you wish to texture an old-school TV set with no signal, or maybe MissingNo.

But what if you want a smoother texture? Well, you would use a smoothing function. Get ready to shift gears and move on to image processing 101.

In 2D image processing, pixels have a certain connectivity with their neighbors. An 8-connected pixel has eight neighbors surrounding it; four touching at the edges and four touching at the corners.

You might also know this concept as a Moore neighborhood and it looks something like this, where CC is the center pixel in question:

g_8Connectivity

Note: To learn more about the Moore neighborhood and image processing in general, check out our Image Processing in iOS tutorial series.

A common use of image smoothing operations is attenuating edge frequencies in an image, which produces a blurred/smeared copy of the original. This is great for your square grid because it reduces harsh intensity changes between neighboring tiles.

For example, if white tiles surround a black tile, a smoothing function will adjust the tiles’ color to a lighter gray. Smoothing functions apply to every pixel when you use a convolution kernel, like the one below:

g_FilterNeighborhood

This is a 3×3 neighborhood averaging filter, which simply smooths a pixel value by averaging it with the values of its 8 neighbors (all with equal weighting). For the image above, the math works out like so:

p = 0.1
p’ = (0.3+0.9+0.5+0.7+0.2+0.8+0.4+0.6+0.1) / 9
p’ = 4.5 / 9
p’ = 0.5

It’s not the most interesting filter, but it’s simple, effective and easy to implement! Open RWTNoise.fsh and add the following function just above main(void):

float smoothNoise(vec2 p) {
  vec2 nn = vec2(p.x, p.y+1.);
  vec2 ne = vec2(p.x+1., p.y+1.);
  vec2 ee = vec2(p.x+1., p.y);
  vec2 se = vec2(p.x+1., p.y-1.);
  vec2 ss = vec2(p.x, p.y-1.);
  vec2 sw = vec2(p.x-1., p.y-1.);
  vec2 ww = vec2(p.x-1., p.y);
  vec2 nw = vec2(p.x-1., p.y+1.);
  vec2 cc = vec2(p.x, p.y);
 
  float sum = 0.;
  sum += randomNoise(nn);
  sum += randomNoise(ne);
  sum += randomNoise(ee);
  sum += randomNoise(se);
  sum += randomNoise(ss);
  sum += randomNoise(sw);
  sum += randomNoise(ww);
  sum += randomNoise(nw);
  sum += randomNoise(cc);
  sum /= 9.;
 
  return sum;
}

It’s a bit long, but also pretty straightforward. Since your square grid is divided into 1×1 tiles, a combination of ±1. in either direction will land you on a neighboring tile. Fragments are batch-processed in parallel by the GPU, so the only way to know about neighboring fragment values in procedural textures is to compute them on the spot.

Modify main(void) to have 128 tiles, and compute n with smoothNoise(position). After those changes, your main(void) function should look like this:

void main(void) {
    vec2 position = gl_FragCoord.xy/uResolution.xx;
    float tiles = 128.;
    position = floor(position*tiles);
    float n = smoothNoise(position);
    gl_FragColor = vec4(vec3(n), 1.);
}

Build and run! You’ve been hit by, you’ve been struck by, a smooooooth functional. :P

g_Tiles128Smooth

Nine separate calls to randomNoise(), for every pixel, are quite the GPU load. It doesn’t hurt to explore 8-connected smoothing functions, but you can produce a pretty good smoothing function with 4-connectivity, also called the Von Neumann neighborhood.

Neighborhood averaging also produces a rather harsh blur, turning your pristine noise into grey slurry. In order to preserve original intensities a bit more, you’ll implement the convolution kernel below:

g_FilterHalf

This new filter reduces neighborhood averaging significantly by having the pixel in question contribute 50% of the final result, with the other 50% coming from its 4 edge-neighbors. For the image above, this would be:

p = 0.1
p’ = (((0.3+0.5+0.2+0.4) / 4) / 2) + (0.1 / 2)
p’ = 0.175 + 0.050
p’ = 0.225

Time for a quick challenge! See if you can implement this half-neighbor-averaging filter in smoothNoise(vec2 p).

Hint: Remember to remove any unnecessary neighbors! Your GPU will thank you and reward you with faster rendering and less griping.

Solution Inside: Smooth Noise Filter
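Here’s a minimal sketch of one possible half-neighbor-averaging filter. It keeps only the four edge neighbors, so the exact structure may differ from the official solution:

float smoothNoise(vec2 p) {
  vec2 nn = vec2(p.x, p.y+1.);
  vec2 ee = vec2(p.x+1., p.y);
  vec2 ss = vec2(p.x, p.y-1.);
  vec2 ww = vec2(p.x-1., p.y);
  vec2 cc = vec2(p.x, p.y);
 
  // The four edge neighbors contribute half of the final value...
  float sum = 0.;
  sum += randomNoise(nn);
  sum += randomNoise(ee);
  sum += randomNoise(ss);
  sum += randomNoise(ww);
  sum /= 8.;
 
  // ...and the center pixel contributes the other half.
  sum += randomNoise(cc)/2.;
 
  return sum;
}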

If you didn’t figure it out, take a look at the code above and replace your existing smoothNoise method with it. Reduce your number of tiles to 8., then build and run.

s_Run19

Your texture is starting to look more natural, with smoother transitions between tiles. Compare the image above (smooth noise) with the one below (random noise) to appreciate the impact of the smoothing function.

s_Run20

Great job so far :]

Procedural Textures: Interpolated Noise

The next step for your noise shader is to rid the tiles of hard edges by using bilinear interpolation, which is simply linear interpolation on a 2D grid.

g_BilinearInterpolationDiagram

For ease of comprehension, the image below shows the desired sampling points for bilinear interpolation within your noise function roughly translated to your previous 2×2 grid:

g_BilinearInterpolationSketch

Tiles can blend into one another by sampling weighted values from their corners at point P. Since each tile is 1×1 unit, the Q points should be sampling noise like so:

Q11 = smoothNoise(0.0, 0.0);
Q12 = smoothNoise(0.0, 1.0);
Q21 = smoothNoise(1.0, 0.0);
Q22 = smoothNoise(1.0, 1.0);

In code, you achieve this with a simple combination of floor() and ceil() functions for p. Add the following function to RWTNoise.fsh, just above main(void):

float interpolatedNoise(vec2 p) {
  float q11 = smoothNoise(vec2(floor(p.x), floor(p.y)));
  float q12 = smoothNoise(vec2(floor(p.x), ceil(p.y)));
  float q21 = smoothNoise(vec2(ceil(p.x), floor(p.y)));
  float q22 = smoothNoise(vec2(ceil(p.x), ceil(p.y)));
 
  // compute R value
  // return P value
}

GLSL already includes a linear interpolation function called mix().

You’ll use it to compute R1 and R2, using fract(p.x) as the weight between two Q points at the same height on the y-axis. Include this in your code by adding the following lines at the bottom of interpolatedNoise(vec2 p):

float r1 = mix(q11, q21, fract(p.x));
float r2 = mix(q12, q22, fract(p.x));

Finally, interpolate between the two R values by using mix() with fract(p.y) as the floating-point weight. Your function should look like the following:

float interpolatedNoise(vec2 p) {
  float q11 = smoothNoise(vec2(floor(p.x), floor(p.y)));
  float q12 = smoothNoise(vec2(floor(p.x), ceil(p.y)));
  float q21 = smoothNoise(vec2(ceil(p.x), floor(p.y)));
  float q22 = smoothNoise(vec2(ceil(p.x), ceil(p.y)));
 
  float r1 = mix(q11, q21, fract(p.x));
  float r2 = mix(q12, q22, fract(p.x));
 
  return mix (r1, r2, fract(p.y));
}

Since your new function requires smooth, floating-point weights and implements floor() and ceil() for sampling, you must remove floor() from main(void).

Replace the lines:

float tiles = 8.;
position = floor(position*tiles);
float n = smoothNoise(position);

With the following:

float tiles = 8.;
position *= tiles;
float n = interpolatedNoise(position);

Build and run. Those hard tiles are gone…

s_Run21

… but there is still a discernible pattern of “stars”, which is totally expected, by the way.

You’ll get rid of the undesirable pattern with a smoothstep function. smoothstep() is a nicely curved function that uses cubic interpolation, and it’s much nicer than simple linear interpolation.

Smoothstep is the magic salt you can sprinkle over everything to make it better.

“Smoothstep is the magic salt you can sprinkle over everything to make it better.” –Jari Komppa

Add the following line inside interpolatedNoise(vec2 p), at the very beginning:

vec2 s = smoothstep(0., 1., fract(p));

Now you can use s as the smooth-stepped weight for your mix() functions, like so:

float r1 = mix(q11, q21, s.x);
float r2 = mix(q12, q22, s.x);
 
return mix (r1, r2, s.y);

Build and run to make those stars disappear!


s_Run22

The stars are definitely gone, but there’s still a bit of a pattern; almost like a labyrinth. This is simply due to the 8×8 dimensions of your square grid. Reduce tiles to 4., then build and run again!

s_Run23

Much better.

Your noise function is still a bit rough around the edges, but it could serve as a texture primitive for billowy smoke or blurred shadows.

Procedural Textures: Moving Noise

Final stretch! Hope you didn’t forget about little ol’ uTime, because it’s time to animate your noise. Simply add the following line inside main(void), just before assigning n:

position += uTime;

Build and run.

Your noisy texture will appear to be moving towards the bottom-left corner, but what’s really happening is that you’re moving your square grid towards the top-right corner (in the +x, +y direction). Remember that 2D noise extends infinitely in all directions, meaning your animation will be seamless at all times.

Pixel Shader Moon

Hypothesis: Sphere + Noise = Moon? You’re about to find out!

To wrap up this tutorial, you’ll combine your sphere shader and noise shader into a single moon shader in RWTMoon.fsh. You have all the information you need to do this, so this is a great time for a challenge!

Hint: Your noise tiles will now be defined by the sphere’s radius, so replace the following lines:

float tiles = 4.;
position *= tiles;

With a simple:

position /= radius;

Also, I double-dare you to refactor a little bit by completing this function:

float diffuseSphere(vec2 p, float r) {
}
Solution Inside: Werewolves, Beware
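If you’d like to check your work, here’s one possible way to complete that refactor. It’s a sketch that assumes cLight is declared as a constant in RWTMoon.fsh, just as it was in RWTSphere.fsh, and that fragments outside the disc have already been discarded:

float diffuseSphere(vec2 p, float r) {
  // Infer a z value for this fragment from the sphere equation.
  float z = sqrt(r*r - p.x*p.x - p.y*p.y);
 
  // Compute the surface normal and return the diffuse lighting term.
  vec3 normal = normalize(vec3(p.x, p.y, z));
  return max(0., dot(normal, cLight));
}

In main(void), one way to combine the two shaders is to multiply this diffuse value by your interpolated noise before assigning gl_FragColor.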

Remember to change your program’s fragment shader source to RWTMoon in RWTViewController.m:

self.shader = [[RWTBaseShader alloc] initWithVertexShader:@"RWTBase" fragmentShader:@"RWTMoon"];

While you’re there, feel free to change your glClearColor() to complement the scene a bit more (I chose xkcd’s midnight purple):

glClearColor(.16f, 0.f, .22f, 1.f);

Build and run! Oh yeah, I’m sure Ozzy Osbourne would approve.

s_Run24

Where To Go From Here?

Here is the completed project with all of the code and resources for this OpenGL ES Pixel Shaders tutorial. You can also find its repository on GitHub.

Congratulations, you’ve taken a very deep dive into shaders and GPUs, like a daring math-stronaut, testing all four dimensions, as well as the limits of iOS development itself! This was quite a different and difficult tutorial, so I whole-heartedly applaud your efforts.

You should now understand how to use the immense power of the GPU, combined with clever use of math, to create interesting pixel-by-pixel renderings. You should be comfortable with GLSL functions, syntax and organization too.

There wasn’t much Objective-C in this tutorial, so feel free to go back to your CPU and think of cool ways to manipulate your shaders even more!

Try adding uniform variables for touch points, or gyroscope data, or microphone input. Browser + WebGL may be more powerful, but Mobile + OpenGL ES is certainly more interesting :]

There are many paths to explore from here on out, and here are a few suggestions:

  • Want to make your shaders super performant? Check out Apple’s OpenGL ES tuning tips (highly recommended for iPhone 5s).
  • Want to read up on Perlin noise and complete your implementation? Please check-in with Ken himself for a quick presentation or detailed history.
  • Think you need to practice the basics? Toby’s got just the thing.
  • Or maybe, just maybe, you think you’re ready for the pros? Well then, please see the masters at work on Shadertoy and drop us a line in the comments if you make any contributions!

In general, I suggest you check out the amazing GLSL Sandbox gallery straight away.

There you can find shaders for all levels and purposes, plus the gallery is edited/curated by some of the biggest names in WebGL and OpenGL ES. They’re the rockstars that inspired this tutorial and are shaping the future of 3D graphics, so a big THANKS to them. (Particularly @mrdoob, @iquilezles, @alteredq.)

If you have any questions, comments or suggestions, feel free to join the discussion below!

OpenGL ES Pixel Shaders Tutorial is a post from: Ray Wenderlich

The post OpenGL ES Pixel Shaders Tutorial appeared first on Ray Wenderlich.

IRC for iOS Developers

Chat with fellow iOS devs on IRC!

Chat with fellow iOS devs on IRC!

I’ve been working from home for over three years now, and while I absolutely love it, one of the things I miss the most about working in an office is the camaraderie you have with fellow developers there.

The good news is that in the past year or so, I’ve found my fix with an online alternative: IRC!

IRC is an internet chat protocol that has been around since the late 1980s. You can connect to IRC servers to chat about any subject imaginable – including iOS development, OS X development, and even Swift development.

I believe IRC is a great way to get to know fellow iOS developers, to get help with questions, and to help out others.

That’s why I’m writing this tutorial! This tutorial will help get you started with:

Let’s get chatting!

Note: Special thanks to Matthijs Hollemans and Nimesh Neema for their assistance with some parts of this tutorial!

Choosing an IRC Client and Getting Started

The first step is to choose, download, and install an OS X IRC client, and then follow the instructions I’ve provided to connect to a chat room. Here are some of the most popular options:

Again – download and install the client of your choice, and then jump to the appropriate instructions below!

Getting Started: Colloquy

Connecting to an IRC server

Start up Colloquy and go to File\New Connection. For Nickname enter your preferred nickname, for Chat Server enter irc.freenode.net, and click Connect:

001_Colloquy

Back in your list of connections, after a few moments you should see a lightning bolt icon appear – this indicates you are connected. Note that you can always double click a connection to connect.

Registering your nickname

Click the Console button to reveal a connection to the IRC server itself. This will allow you to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

004_Colloquy

Check your email and enter the command that it tells you in the text field and hit enter to continue. You should see a success message from NickServ.

Back in your Connections list, right click your connection and choose Get Info. Enter the password you set in the password field:

005_Colloquy

Right click on the connection, and choose Disconnect. Then double click to connect again. If you still have your console open, you will see an “authentication successful” message – this means your nickname and password are registered!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Click the Join Room button in your Connections window:

006_Colloquy

Make sure the Connection is set to irc.freenode.net, for the Chat Room enter cocoa-init, and click Join:

007_Colloquy

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Adium

Connecting to an IRC server

Start up Adium. If the Setup Assistant appears, click the x button to dismiss it.

Then go to File\Add Account\IRC (Internet Relay Chat). For Nickname enter your preferred nickname, for Hostname enter irc.freenode.net, and click OK:

008_Adium

After a few moments, the green icon next to your name should light up to indicate that you are online. Note that you can always use the dropdown to switch your status to available to connect.

Registering your nickname

Go to File\New Chat, make sure that From is set to the IRC account you just added, set To to NickServ, and click Message. This will allow you to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

009_Adium

Enter the following command down in the text field at the bottom of the screen and hit enter:

REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

010_Adium

Check your email and enter the command that it tells you in the text field (without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Close the NickServ window. In the Contacts window, choose the dropdown next to Available and set it to Offline to disconnect. Then set it back to Available to reconnect.

After a few moments, NickServ will ask you for your password, so enter the password you set in the password field:

011_Adium

If you don’t see any errors – this means your nickname and password are registered!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Go to File\Join Group Chat…, make sure the Account is set to irc.freenode.net, for Channel enter #cocoa-init, and click Join:

012_Adium

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Irssi

Connecting to an IRC server

Irssi is different from the other options so far in that everything is on the command line!

Start up Irssi and you’ll see the following:

014_Irssi

Enter these commands to connect to Freenode:

/set nick yournickname
/network add -whois 1 -msgs 4 -kicks 1 -modes 4 freenode
/server add -auto -network freenode irc.freenode.net 6667
/connect freenode

After a few moments you should see some welcome messages from Freenode – this indicates you are connected.

015_Irssi

Registering your nickname

Next you need to send some commands to NickServ to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

This causes irssi to open a new window – use Command-P to switch to it.

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email in the new window.

Check your email and enter the command that it tells you in the text field (but without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Hit Command-P to go back to the main window. Enter these commands to automatically identify with NickServ when you connect from now on:

/network add -autosendcmd "/^msg nickserv identify password;wait 2000" freenode
/save
/quit

Restart irssi, and verify that you automatically connect and register your nickname.

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Simply enter the following command:

/join #cocoa-init

You will see a list of users in the channel, and you can use the text field at the bottom to chat.

013_Irssi

And you’re in! You can use the text field at the bottom to chat. For more information, check out the Irssi documentation.

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

Getting Started: Textual

Connecting to an IRC server

Start up Textual, click the + button in the lower left, and select Add Server:

016_Textual

For Network Name enter Freenode and for Server Address enter irc.freenode.net:

017_Textual

Switch to the Identity tab, for Nickname enter your preferred nickname, and click Save:

018_Textual

Back in the main window, double click the Freenode entry to connect. You should see a message from the server – this indicates you are connected.

019_Textual

Registering your nickname

Next you need to send some commands to register your nickname, which is a prerequisite to connecting to some of the iOS development channels.

Enter the following command down in the text field at the bottom of the screen and hit enter:

/msg NickServ REGISTER password youremail@example.com

After a few moments, you should see a reply from NickServ letting you know that it has sent you an email:

020_Textual

Check your email and enter the command that it tells you in the text field (without the /msg NickServ part) and hit enter to continue. You should see a success message from NickServ.

Back on the sidebar, right click your Freenode connection and choose Server Properties. In the Identity tab, enter the password you set in the Personal Password field:

021_Textual

Right click on the freenode connection, and choose Disconnect. Then right click and choose Connect to connect again. If you don’t get any errors, this means you’re connected and authenticated successfully!

Joining a channel

Now for the fun part – joining a chat channel for iOS developers. Right click the Freenode entry and choose Join Channel. For Channel enter #cocoa-init, and click Save:

022_Textual

And you’re in! You can use the text field at the bottom to chat.

At this point, feel free to skip ahead to the next section to find out about more channels you can join!

Getting Started: IRCCloud

Note: Some IRC channels ban web-based clients like IRCCloud. You may prefer to use one of the other clients to avoid this.

Connecting to an IRC server

Go to irccloud.com and register for a free account. Once you have signed up, you will be automatically directed to the Join a new network screen.

IRCCloud_Homepage

Under hostname, enter irc.freenode.net. For Nickname, enter your preferred nickname. Leave the other values at their defaults and click the Join network button.

IRCCloud_JoinNetwork

Registering your nickname

You will need to register your nickname with the server before you can start chatting. Click on freenode towards the right side of the window to reveal the server console. Here you can send commands to register your nickname, which is required to connect to some of the iOS development channels.

In the text field shown at the bottom of the screen, enter the following command:

/msg NickServ REGISTER password youremail@example.com

IRCCloud_RegisterNickname

After a few moments, you should see a reply from NickServ, letting you know that it has sent you an email:

Check your email and enter the command that it tells you in the text field and hit enter to continue. You will see a successfully verified message from NickServ.

IRCCloud_VerifyRegistration

Now click on freenode towards the right side to select the server and click on the Identify Nickname button. Once you are identified successfully, you are good to join channels.

IRCCloud_IdentifyNickname

Joining a channel

In the text field at the bottom of the screen, enter the following command:

/join #cocoa-init

You will soon be redirected to the #cocoa-init channel screen. You can use the text field at the bottom of the screen to start chatting.

IRCCloud_Chatting

At this point, feel free to skip ahead to the IRC Channels for iOS Developers section to find out about more channels you can join!

IRC Channels for iOS Developers

Now that you’ve successfully connected to IRC, you may be wondering what some good channels are to join. Here are our recommendations:

  • #cocoa-init: This is the channel you connected to in the tutorial. It’s actually a brand new channel, oriented to new developers (and beginner questions) in particular. It’s great if you are either a new Cocoa developer, or if you enjoy helping or meeting newer developers. Kyle Robson, Erica Sadun, and Lyle Andrews are the lead organizers of this channel, and I hang out here from time to time, so stop by and say hi!
  • #swift-lang: Another relatively new channel, focused on the Swift language itself. This channel is particularly active lately and has some nice discussions. Mike Ash hangs out here.
  • #iphonedev: The original and busiest iOS development channel on Freenode. This is the place to go for giving and getting advice on intermediate to advanced topics. Discussions about the official SDK only, no jailbreaking.
  • #iphonedev-chat: This is the sister channel of #iphonedev, for off-topic discussions. Sometimes it’s fun to talk about things other than apps, and this is the place to go. It’s great for those water cooler conversations — get your gossip here!
  • #macdev: All the cool kids are doing iOS these days but if you’re old school and make OS X apps, then this is the channel to find likeminded developers. It’s not as busy as the iPhone channels but the regulars here are very knowledgeable.
  • #iphone: For chatting about everything related to the iPhone. This is also a good place to go for jailbreaking questions.

IRC Etiquette

There are a few areas of IRC Etiquette that you should keep in mind.

First, it’s cool to ask questions on IRC, but if you do, be sure to try to answer questions and help others as well. Learn the art of asking good questions. If you want to share source code, don’t paste it directly into the channel but use a “pastebin” instead.

Second, note that IRC can be very distracting if you let it. What I personally have found helpful is to simply minimize IRC and ignore it for a while when I get busy or am in the middle of something. Don’t worry, no-one will be insulted if you leave mid-conversation – we all do the same thing :]

Sometimes people who have nothing better to do with their time (usually bored kids) find it funny to troll on IRC. They do this just to get a rise out of people. The best advice is to ignore them. If a troll finds no response, they’ll go away eventually. If the trolling gets really bad, notify one of the channel operators so they can kick the trolls out of the room. Of course, don’t be a troll yourself. ;]

Remember that text — especially in real time chat — lacks the finesse of face-to-face conversation. It’s good to have a thick skin on IRC. It’s easy to get offended — or to offend — and start a flame war, but that spoils the mood for everyone and will get you kicked, or even banned, from the channel. Respect the channel rules.

Tip: Most IRC clients support “tab completion”. So if you want to respond to someone with the nick JonnyAppleseed, just type the first few letters of the nick followed by the tab key, and the IRC app will complete the name for you. Typing “jo<tab>” is a lot quicker than typing the full name.

Be nice, and make friends!

Where To Go From Here?

Enjoy! Remember the whole idea is to have an informal place to chat, help each other out, hang out, and have fun – when you have time to spare and need a “water cooler” moment! :]

Many other IRC fans and I hope to get a chance to talk to you soon!

The post IRC for iOS Developers appeared first on Ray Wenderlich.

Video Tutorial: Introduction to Unity Part 1: Introduction

Video Tutorial: Introduction to Unity Part 2: Using the Interface

How To Make an App Like RunKeeper: Part 1


23_multicolor_polyline

At the recent WWDC 2014 conference, Apple previewed the upcoming HealthKit API and corresponding Health app. Meanwhile, apps in the Health and Fitness category are hugely popular on the App Store.

RunKeeper, a GPS app like the one you’re about to make, has over 25 million users! What’s clear from this growing trend is:

  • Health is extremely important
  • Smartphones can serve as useful aids for managing and tracking your health
  • Developing fun apps can help people stay motivated, on track to achieve goals and, most importantly, stay healthy!

This tutorial will show you how to make an app like RunKeeper. It will be a motivational run-tracking app. By the end of this tutorial, you’ll have an app that:

  • Uses Core Location to track your route.
  • Shows a map during your run, with a constantly-updating line denoting your path.
  • Continually reports your average pace as you run.
  • Awards badges for running various distances.
    • Silver and gold versions of each badge recognize personal improvements, regardless of your starting point.
  • Encourages you by tracking the remaining distance to the next checkpoint.
  • Shows a map of your route when you’re finished.
    • The map line is color-coded so that the slower portions are red and the faster portions are green.

The result? Your new app, called MoonRunner, with badges based on planets and moons in our Solar System! Yes, you and your users will be traveling through space to earn these badges :]

Before you run headlong into this tutorial, there are some prerequisite skills and suggested prior reading:

  1. Storyboards: You’ll use Storyboards for the UI on this app, so read this excellent tutorial for a refresher.
  2. Core Data: Knowing where you start and measuring your progress are fundamental to personal training. So, to save data, you’ll use the most widely-used data persistence framework in Cocoa, Core Data. Here is an excellent Core Data tutorial to get you up to speed.
  3. While it’s not required, the Breadcrumb sample project from Apple contains some helpful (but currently deprecated) example code to show a map during the run. You’ll do the same from scratch later in this tutorial, but it doesn’t hurt to brush up on the topic of MapKit if you’re unfamiliar.

There’s so much to talk about that this tutorial comes in two parts. The first segment focuses on recording the run data and rendering the color-coded map. The second segment introduces the badge system.

Getting Started

The first thing to do, of course, is to make your new project.

Open Xcode. Select File\New\Project. Then select iOS\Application\Master-Detail Application.

1_create_proj

Give it the name MoonRunner and make sure to check the Use Core Data option.

2_name_proj

Just like that, you already have a template project built with Core Data and storyboards!

Model: Runs and Locations

Your usage of Core Data in this app is fairly minimal, with only two entities: Runs and Locations.

Open MoonRunner.xcdatamodeld. Then delete the existing Event entity by clicking on it and pressing backspace. Then add two new entities, called Run and Location. Set up Run with the following properties:

3_run_model

A Run contains a duration, a distance and a timestamp. It also has a relationship called locations that relates to your other object:

Now set up Location with the following properties:

4_location_model

A Location contains a latitude and longitude, as well as a timestamp. Additionally, Location relates back to a Run.

Be sure to set the locations relationship as a to-many, ordered relationship.

4_5_locations_relationship

Next, have Xcode generate the model classes. Click File\New\File and then choose Core Data\NSManagedObject subclass.

5_model_files

When the follow-up options present themselves, be sure to include the model for MoonRunner and both Run and Location.

6_select_model

7_manage_files

Alright! And just like that, you’ve completed the Core Data piece of this app.

This tutorial will make use of MapKit, so it needs to be linked. Click on the project at the top of the project navigator and open the target’s Build Phases. Add MapKit to the list inside the Link Binary With Libraries section:

40_add_mapkit

Setting Up the Storyboards

Time to move on to some visuals! First up is laying out the storyboard. Remember, if the following is new to you, or you’re experiencing a temporary memory lapse, refer to the Storyboards tutorial for a bit of help with the concepts.

Open Main.storyboard.

For this first set of features, MoonRunner has a simple flow:

  • A Home screen that presents a few app navigation options
  • A New Run screen where the user begins and records a new run
  • A Run Details screen that shows the results, including the color-coded map

None of these are a UITableViewController, so click on the black bar below the pre-made Master View Controller to select it, then press delete.

9_delete_master_vc

Then, add two new view controllers by dragging two from the object library. Name one of them ‘Home’ by selecting the black bar again, and then setting the title in the attributes inspector. Name the other ‘New Run’.

Control-drag from the Navigation Controller to Home and set the relationship to ‘root view controller’.

11_storyboard_rootview

Then control-drag from the yellow icon on the New Run view controller’s black bar to the Detail screen and set the segue to push.

12_storyboard_push

Click on the segue that you just added. Then in the Attributes Inspector, assign the name ‘RunDetails’ to the segue.

13_name_push_segue

Behold…your Storyboards:

10_new_vcs_2

Now you’re going to lay out each of the screens.

Fleshing out the Storyboards

Please download the resource package. Then unzip the package and drag all the files into the project. Make sure you select “Copy items into destination group’s folder (if needed)”.

Open Main.storyboard and find the ‘Home’ view controller.

The ‘Home’ view controller is the main menu for your app. Drag in a UILabel to the top of the View, and add a warm, welcoming message like ‘Welcome To MoonRunner!’

Then drag in three UIButtons, and give them the titles ‘New Run’, ‘Past Runs’ and ‘My Badges’.

In the starter pack there are green-btn and blue-btn assets, and I used a black background with white text to make the app look like this:

14_home_storyboard

Then, control-drag from the ‘New Run’ button to the ‘New Run’ screen, and select a push segue:

15_btn_segue

The New Run screen has two modes: pre-run and during-run. The View Controller handles the logic of how each one displays, but for now drag out three UILabels: one each to display real-time updates of distance, time and pace.

Add a UILabel as a prompt (For example, “Ready To Launch?”), and two UIButtons to Start/Stop the run.

I used a black background and the green-btn and red-btn assets to provide a little styling. If you do the same, your storyboard will look like this:

16_new_run_storyboard

Lastly, the Run Details screen is where the user sees a map of their route, along with other details that relate to a specific run.

Find the ‘Detail View Controller’ on the storyboard. Then delete the existing label and add an MKMapView, as well as four UILabels. The labels will be used for distance, date, time and pace.

With the usual, void-of-space-black background, your storyboard should look like this:

17_run_details_storyboard

Great! You’ll come back to these to make a few connections, but for now it’s time to get a little control over things…with Controllers.

Completing the Basic App Flow

Xcode did a good chunk of the heavy lifting for you in the Master-Detail app template, but you don’t need what’s in the MasterViewController. So go ahead and delete MasterViewController.h and MasterViewController.m.

Then replace it with a new, fresh HomeViewController by choosing File\New\File and selecting the iOS\Cocoa Touch\Objective-C class template. Choose UIViewController as your subclass, and name the new class HomeViewController.

You’ll need to feed this View Controller an NSManagedObjectContext from the App Delegate, so add a new NSManagedObjectContext property to HomeViewController.h. It should now look like this:

#import <UIKit/UIKit.h>
 
@interface HomeViewController : UIViewController
 
@property (strong, nonatomic) NSManagedObjectContext *managedObjectContext;
 
@end

Move back to AppDelegate.m for a moment — this file has errors since you removed MasterViewController. Oh no!

Actually, it’s not a big deal. Replace the import of MasterViewController.h at the top with this:

#import "HomeViewController.h"

Next, make application:didFinishLaunchingWithOptions: look like this:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    UINavigationController *navigationController = (UINavigationController *)self.window.rootViewController;
    HomeViewController *controller = (HomeViewController *)navigationController.topViewController;
    controller.managedObjectContext = self.managedObjectContext;
    return YES;
}

Great! Now add another file by selecting File\New\File and choosing the iOS\Cocoa Touch\Objective-C class template. Call the file NewRunViewController, once again inheriting from UIViewController.

Make the same changes to NewRunViewController.h as you did to HomeViewController.h:

#import <UIKit/UIKit.h>
 
@interface NewRunViewController : UIViewController
 
@property (strong, nonatomic) NSManagedObjectContext *managedObjectContext;
 
@end

Open NewRunViewController.m and add the following to the top of the file:

#import "DetailViewController.h"
#import "Run.h"
 
static NSString * const detailSegueName = @"RunDetails";
 
@interface NewRunViewController () <UIActionSheetDelegate>
 
@property (nonatomic, strong) Run *run;
 
@property (nonatomic, weak) IBOutlet UILabel *promptLabel;
@property (nonatomic, weak) IBOutlet UILabel *timeLabel;
@property (nonatomic, weak) IBOutlet UILabel *distLabel;
@property (nonatomic, weak) IBOutlet UILabel *paceLabel;
@property (nonatomic, weak) IBOutlet UIButton *startButton;
@property (nonatomic, weak) IBOutlet UIButton *stopButton;
 
@end

Now you have Interface Builder outlets for each of the views in your storyboard, as well as a constant string for the segue name.

Did you notice that you’ll need to implement the UIActionSheetDelegate protocol? You’ll do that very soon.

The next steps are to define the initial state of the UI and the actions related to button presses.

Add the following method to the main implementation body of NewRunViewController:

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
 
    self.startButton.hidden = NO;
    self.promptLabel.hidden = NO;
 
    self.timeLabel.text = @"";
    self.timeLabel.hidden = YES;
    self.distLabel.hidden = YES;
    self.paceLabel.hidden = YES;
    self.stopButton.hidden = YES;
}

The only things visible at first are the start button and the prompt. Next, add the following methods:

-(IBAction)startPressed:(id)sender
{
    // hide the start UI
    self.startButton.hidden = YES;
    self.promptLabel.hidden = YES;
 
    // show the running UI
    self.timeLabel.hidden = NO;
    self.distLabel.hidden = NO;
    self.paceLabel.hidden = NO;
    self.stopButton.hidden = NO;
}
 
- (IBAction)stopPressed:(id)sender
{
    UIActionSheet *actionSheet = [[UIActionSheet alloc] initWithTitle:@"" delegate:self 
            cancelButtonTitle:@"Cancel" destructiveButtonTitle:nil 
            otherButtonTitles:@"Save", @"Discard", nil];
    actionSheet.actionSheetStyle = UIActionSheetStyleDefault;
    [actionSheet showInView:self.view];
}

This code provides two actions to be used when the two buttons are pressed. Pressing the start button switches the UI into “during-run” mode, while pressing the stop button shows a UIActionSheet so the user can decide whether to save or discard the run.

Now you need to make sure something’s there to respond to the user’s choice on the UIActionSheet. Add the following method:

- (void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex
{
    // save
    if (buttonIndex == 0) {
        [self performSegueWithIdentifier:detailSegueName sender:nil];
 
    // discard
    } else if (buttonIndex == 1) {
        [self.navigationController popToRootViewControllerAnimated:YES];
    }
}

This will perform the detail segue if the ‘Save’ button is tapped, and pop to the root (i.e. the home screen) if the ‘Discard’ button is tapped.

Finally, add the following method:

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    [[segue destinationViewController] setRun:self.run];
}

This sets up the DetailViewController with the current run when the segue is triggered.

For now, the run is empty, but that’s because you haven’t gone on a run yet — or finished the app, for that matter. You can see where this is going, though: that run will soon be constructed from all the locations read during the run.

You’re done with NewRunViewController.m for now!

Open DetailViewController.h. Make it look like this:

#import <UIKit/UIKit.h>
 
@class Run;
 
@interface DetailViewController : UIViewController
 
@property (strong, nonatomic) Run *run;
 
@end

This adds a Run property to the detail view controller. In fact, the one you set in the segue just now! This will be the run that the detail view controller is set up to show details for.

Then open DetailViewController.m and replace the entire file with the following code:

#import "DetailViewController.h"
#import <MapKit/MapKit.h>
 
@interface DetailViewController () <MKMapViewDelegate>
 
@property (nonatomic, weak) IBOutlet MKMapView *mapView;
@property (nonatomic, weak) IBOutlet UILabel *distanceLabel;
@property (nonatomic, weak) IBOutlet UILabel *dateLabel;
@property (nonatomic, weak) IBOutlet UILabel *timeLabel;
@property (nonatomic, weak) IBOutlet UILabel *paceLabel;
 
@end
 
@implementation DetailViewController
 
#pragma mark - Managing the detail item
 
- (void)setRun:(Run *)run
{
    if (_run != run) {
        _run = run;
        [self configureView];
    }
}
 
- (void)configureView
{
}
 
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self configureView];
}
 
@end

This imports MapKit, so that you can make use of MKMapView. It also adds outlets for the various parts of the UI. Then it sets up the basis for the rest of the code: a configureView method that configures the view for the current run, called both when the view is loaded and when a new run is set.

There’s just one last step before you’re ready to return to your Storyboards and connect all your outlets. Open HomeViewController.m and add the following import at the top of the file:

#import "NewRunViewController.h"

Then add the following method in the implementation:

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    UIViewController *nextController = [segue destinationViewController];
    if ([nextController isKindOfClass:[NewRunViewController class]]) {
        ((NewRunViewController *) nextController).managedObjectContext = self.managedObjectContext;
    }
}

Finally, head back to your storyboard and set all the following (again, if you need a refresher on storyboards, refer here):

  1. Set the class of the Home View Controller to HomeViewController.
  2. Set the class of the New Run View Controller to NewRunViewController.
  3. Connect all the outlets of the NewRunViewController and DetailViewController.
  4. Connect both the received actions (startPressed: and stopPressed:) in NewRunViewController.
  5. Connect the MKMapView to DetailViewController as its delegate.

After you’ve done all that, you’ve hit the first checkpoint. Build and Run!

41_initial_flow

Your app should now have the very basics of its UI: you should be able to navigate between the three screens, and “start” and “stop” a run. Now it’s time to really get into the guts of the app!

Math and Units

Note that several of the views you created and attached in the storyboard involve displaying statistics and times. Core Location sensibly measures everything in the units of science, aka the metric system. However, a good chunk of your users live in the United States and use silly units like miles, so you need to make sure both systems are supported in your app. :]

Click File\New\File. Select iOS\Cocoa Touch\Objective-C class. Call the class MathController, make it a subclass of NSObject, and create it. Then open MathController.h and make the file look like this:

#import <Foundation/Foundation.h>
 
@interface MathController : NSObject
 
+ (NSString *)stringifyDistance:(float)meters;
+ (NSString *)stringifySecondCount:(int)seconds usingLongFormat:(BOOL)longFormat;
+ (NSString *)stringifyAvgPaceFromDist:(float)meters overTime:(int)seconds;
 
@end

Then open MathController.m and add the following to the top of the file.

static bool const isMetric = YES;
static float const metersInKM = 1000;
static float const metersInMile = 1609.344;

Those in the US can change isMetric to NO if they wish! :]

Next, add the following methods to the implementation:

+ (NSString *)stringifyDistance:(float)meters
{    
    float unitDivider;
    NSString *unitName;
 
    // metric
    if (isMetric) {
        unitName = @"km";
        // to get from meters to kilometers divide by this
        unitDivider = metersInKM;
    // U.S.
    } else {
        unitName = @"mi";
        // to get from meters to miles divide by this
        unitDivider = metersInMile;
    }
 
    return [NSString stringWithFormat:@"%.2f %@", (meters / unitDivider), unitName];
}
 
+ (NSString *)stringifySecondCount:(int)seconds usingLongFormat:(BOOL)longFormat
{
    int remainingSeconds = seconds;
    int hours = remainingSeconds / 3600;
    remainingSeconds = remainingSeconds - hours * 3600;
    int minutes = remainingSeconds / 60;
    remainingSeconds = remainingSeconds - minutes * 60;
 
    if (longFormat) {
        if (hours > 0) {
            return [NSString stringWithFormat:@"%ihr %imin %isec", hours, minutes, remainingSeconds];
        } else if (minutes > 0) {
            return [NSString stringWithFormat:@"%imin %isec", minutes, remainingSeconds];
        } else {
            return [NSString stringWithFormat:@"%isec", remainingSeconds];
        }
    } else {
        if (hours > 0) {
            return [NSString stringWithFormat:@"%02i:%02i:%02i", hours, minutes, remainingSeconds];
        } else if (minutes > 0) {
            return [NSString stringWithFormat:@"%02i:%02i", minutes, remainingSeconds];
        } else {
            return [NSString stringWithFormat:@"00:%02i", remainingSeconds];
        }
    }
}
 
+ (NSString *)stringifyAvgPaceFromDist:(float)meters overTime:(int)seconds
{
    if (seconds == 0 || meters == 0) {
        return @"0";
    }
 
    float avgPaceSecMeters = seconds / meters;
 
    float unitMultiplier;
    NSString *unitName;
 
    // metric
    if (isMetric) {
        unitName = @"min/km";
        unitMultiplier = metersInKM;
    // U.S.
    } else {
        unitName = @"min/mi";
        unitMultiplier = metersInMile;
    }
 
    int paceMin = (int) ((avgPaceSecMeters * unitMultiplier) / 60);
    int paceSec = (int) (avgPaceSecMeters * unitMultiplier - (paceMin*60));
 
    return [NSString stringWithFormat:@"%i:%02i %@", paceMin, paceSec, unitName];
}

These methods are helpers that convert distances, times and speeds into nicely formatted strings. I won’t explain them in too much detail because they should be self-explanatory. If you want a challenge later, why not localize the numbers? Refer to this tutorial on the subject if you would like to do that.
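
For a quick sanity check of how these helpers behave, here’s a minimal sketch you could drop anywhere that imports MathController.h. The sample values are made up purely for illustration, and the expected output assumes the isMetric = YES setting above:

// 2,500 meters covered in 750 seconds is 2.50 km at a 5:00 min/km pace.
NSLog(@"%@", [MathController stringifyDistance:2500]);                        // "2.50 km"
NSLog(@"%@", [MathController stringifySecondCount:750 usingLongFormat:YES]);  // "12min 30sec"
NSLog(@"%@", [MathController stringifySecondCount:750 usingLongFormat:NO]);   // "12:30"
NSLog(@"%@", [MathController stringifyAvgPaceFromDist:2500 overTime:750]);    // "5:00 min/km"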

Starting the Run

Do you remember who said, “Even a marathon begins with just a single CLLocationManager update”? Probably no one — until just now.

Next up, you’ll start recording the location of the user for the duration of the run.

You need to make one important project-level change first. Click on the project at the top of the project navigator. Then select the Capabilities tab and open up Background Modes. Turn on the switch for this section on the right and then tick Location Updates. This will allow the app to update the location even if the user presses the home button to take a call, browse the net or find out where the nearest Starbucks is! Neat!

Background modes

Did you know? If your app goes in the App Store, you’ll have to attach this disclaimer to your app’s description: “Continued use of GPS running in the background can dramatically decrease battery life.”

Now, back to the code. Open NewRunViewController.m. Then add the following imports at the top of the file:

#import <CoreLocation/CoreLocation.h>
#import "MathController.h"
#import "Location.h"

Next add the CLLocationManagerDelegate protocol conformance declaration and several new properties to the class extension category:

@interface NewRunViewController () <UIActionSheetDelegate, CLLocationManagerDelegate>
 
@property int seconds;
@property float distance;
@property (nonatomic, strong) CLLocationManager *locationManager;
@property (nonatomic, strong) NSMutableArray *locations;
@property (nonatomic, strong) NSTimer *timer;
 
...
  • seconds tracks the duration of the run, in seconds.
  • distance holds the cumulative distance of the run, in meters.
  • locationManager is the object you’ll tell to start or stop reading the user’s location.
  • locations is an array to hold all the Location objects that will roll in.
  • timer will fire each second and update the UI accordingly.

Now add the following method to the implementation:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self.timer invalidate];
}

With this method, the timer is stopped when the user navigates away from the view.

Add the following method:

- (void)eachSecond
{
    self.seconds++;
    self.timeLabel.text = [NSString stringWithFormat:@"Time: %@",  [MathController stringifySecondCount:self.seconds usingLongFormat:NO]];
    self.distLabel.text = [NSString stringWithFormat:@"Distance: %@", [MathController stringifyDistance:self.distance]];
    self.paceLabel.text = [NSString stringWithFormat:@"Pace: %@",  [MathController stringifyAvgPaceFromDist:self.distance overTime:self.seconds]];
}

This is the method that will be called every second, by using an NSTimer (which will be set up shortly). Each time this method is called, you increment the second count and update each of the statistics labels accordingly.

And guess what? Another method! Add the following:

- (void)startLocationUpdates
{
    // Create the location manager if this object does not
    // already have one.
    if (self.locationManager == nil) {
        self.locationManager = [[CLLocationManager alloc] init];
    }
 
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    self.locationManager.activityType = CLActivityTypeFitness;
 
    // Movement threshold for new events.
    self.locationManager.distanceFilter = 10; // meters
 
    [self.locationManager startUpdatingLocation];
}

If needed, you make a CLLocationManager, and set its delegate to this class so it knows where to send all the location updates.

You then feed it a desiredAccuracy of kCLLocationAccuracyBest. You may think this is silly; why ask for anything less than the best?

The device can make a precise reading with GPS hardware, but this is expensive in terms of battery usage. It can also do a lower-power, roughly accurate reading using radios that are already turned on, such as Wi-Fi or cell-tower readings.

Yes, the GPS route takes more battery, and there are some use cases for the less accurate readings, e.g. where you only need to verify that a user is in a general area. However, this app tracks your actual run, so it needs to be as accurate as possible.

The activityType parameter is made specifically for an app like this. It intelligently helps the device to save some power throughout a user’s run, say if they stop to cross a road.

Lastly, you set a distanceFilter of 10 meters. As opposed to the activityType, this parameter doesn’t affect battery life. The activityType is for readings, and the distanceFilter is for the reporting of readings.

As you’ll see after doing a test run later, the location readings can zig and zag a little from a straight line.

A higher distanceFilter could reduce that zigging and zagging and thus give you a smoother line. Unfortunately, too high a filter would over-simplify the route and lose detail. That’s why 10 meters is a good balance.

Finally, you tell the manager to start getting location updates! Time to hit the pavement. Wait! Are you lacing up your shoes? Not that kind of run!
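
One caveat that this tutorial doesn’t cover: if you build against iOS 8 or later, Core Location won’t deliver any updates until the user explicitly grants permission. A minimal sketch, assuming you also add the NSLocationAlwaysUsageDescription key with a short explanation string to your Info.plist, is to request authorization inside startLocationUpdates, just before the startUpdatingLocation call:

// iOS 8+ requires explicit authorization before location updates are delivered.
// The respondsToSelector: check keeps this safe on iOS 7, where the method doesn't exist.
if ([self.locationManager respondsToSelector:@selector(requestAlwaysAuthorization)]) {
    [self.locationManager requestAlwaysAuthorization];
}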

To actually begin the run, add these lines to the end of startPressed::

self.seconds = 0;
self.distance = 0;
self.locations = [NSMutableArray array];
self.timer = [NSTimer scheduledTimerWithTimeInterval:(1.0) target:self
        selector:@selector(eachSecond) userInfo:nil repeats:YES];
[self startLocationUpdates];

Here all the fields that will update continually throughout the run are reset.

Recording the Run

You’ve created the CLLocationManager, but now you need to get updates from it. That is done through its delegate. Open NewRunViewController.m once again and add the following method:

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations
{
    for (CLLocation *newLocation in locations) {
        if (newLocation.horizontalAccuracy < 20) {
 
            // update distance
            if (self.locations.count > 0) {
                self.distance += [newLocation distanceFromLocation:self.locations.lastObject];
            }
 
            [self.locations addObject:newLocation];
        }
    }
}

This will be called each time there are new location updates to provide to the app.

It updates quite often with an array of CLLocations. Usually this array only contains one object, but if there are more, they are ordered by time with the most recent location last.

A CLLocation contains some great information. Namely the latitude and longitude, along with the timestamp of the reading.

But before blindly accepting the reading, it’s worth a horizontalAccuracy check. If the device isn’t confident it has a reading within 20 meters of the user’s actual location, it’s best to keep it out of your dataset.

Note: This check is especially important at the start of the run, when the device first starts narrowing down the general area of the user. At that stage, it’s likely to update with some inaccurate data for the first few points.

If the CLLocation passes the check, then the distance between it and the most recent point is added to the cumulative distance of the run. The distanceFromLocation: method is very convenient here, taking into account all sorts of surprisingly-difficult math involving the Earth’s curvature.

Finally, the location object itself is added to a growing array of locations.

Note: The CLLocation object also contains information on altitude, with a corresponding verticalAccuracy value. As every runner knows, hills can be a game changer on any run, and altitude can affect the amount of oxygen available. A challenge to you, then, is to think of a way to incorporate this data into the app.
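
If you decide to take on that challenge, here’s one possible sketch. It assumes a hypothetical totalClimb property (not part of this tutorial’s code) added to the class extension and reset to 0 in startPressed:, and it would sit inside the accuracy check in the locationManager:didUpdateLocations: method shown above:

// Hypothetical example: accumulate total climb, counting only uphill movement.
// A negative verticalAccuracy means the altitude reading is invalid, so check it first.
if (newLocation.verticalAccuracy >= 0 && newLocation.verticalAccuracy < 10 && self.locations.count > 0) {
    CLLocationDistance climb = newLocation.altitude - ((CLLocation *)self.locations.lastObject).altitude;
    if (climb > 0) {
        self.totalClimb += climb;
    }
}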

Saving the Run

At some point, despite that voice of motivation inside you that tells you to keep going (mine sounds like a high school coach), there comes a time to end the run. You already arranged for the UI to accept this input, and now it’s time to process that data.

Add this method to NewRunViewController.m:

- (void)saveRun
{
    Run *newRun = [NSEntityDescription insertNewObjectForEntityForName:@"Run" 
            inManagedObjectContext:self.managedObjectContext];
 
    newRun.distance = [NSNumber numberWithFloat:self.distance];
    newRun.duration = [NSNumber numberWithInt:self.seconds];
    newRun.timestamp = [NSDate date];
 
    NSMutableArray *locationArray = [NSMutableArray array];
    for (CLLocation *location in self.locations) {
        Location *locationObject = [NSEntityDescription insertNewObjectForEntityForName:@"Location"
                                                                 inManagedObjectContext:self.managedObjectContext];
 
        locationObject.timestamp = location.timestamp;
        locationObject.latitude = [NSNumber numberWithDouble:location.coordinate.latitude];
        locationObject.longitude = [NSNumber numberWithDouble:location.coordinate.longitude];
        [locationArray addObject:locationObject];
    }
 
    newRun.locations = [NSOrderedSet orderedSetWithArray:locationArray];
    self.run = newRun;
 
    // Save the context.
    NSError *error = nil;
    if (![self.managedObjectContext save:&error]) {
        NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
        abort();
    }
}

So what’s happening here? If you’ve done a simple Core Data flow before, this should look like a familiar way to save new objects. You create a new Run object, and give it the cumulative distance and duration values as well as assign it a timestamp.

Each of the CLLocation objects recorded during the run is trimmed down to a new Location object and saved. The locations link with the run, and then you’re good to go!

Finally, edit actionSheet:clickedButtonAtIndex: so that it stops location updates, and saves the run before performing the segue. Find the method and make it look like this:

- (void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex
{
    [self.locationManager stopUpdatingLocation];
 
    // save
    if (buttonIndex == 0) {
        [self saveRun]; ///< ADD THIS LINE
        [self performSegueWithIdentifier:detailSegueName sender:nil];
 
    // discard
    } else if (buttonIndex == 1) {
        [self.navigationController popToRootViewControllerAnimated:YES];
    }
}

Send the Simulator On a Run

As much as I hope that this tutorial and the apps you build encourage more enthusiasm for fitness, the Build and Run phase does not need to be taken that literally while developing it.

You don’t need to lace up and head out the door either, for there’s a way to get the simulator to pretend it’s running!

Build & run in the simulator, and select Debug\Location\City Run to have the simulator start generating mock data:

20_simulator_run

Of course, this is much easier and less exhausting than taking a short run to test this — or any other — location-based app.

However, I recommend eventually doing a true beta test with a device. Doing so gives you the chance to fine-tune the location manager parameters and assess the quality of location data you can really get.

Thorough testing could help instill a healthy habit, too. :]

Revealing the Map

Now it’s time to show the map and the post-run stats!

Open DetailViewController.m and add these imports to the top:

#import "MathController.h"
#import "Run.h"
#import "Location.h"

Then, make configureView look like this:

- (void)configureView
{
    self.distanceLabel.text = [MathController stringifyDistance:self.run.distance.floatValue];
 
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateStyle:NSDateFormatterMediumStyle];
    self.dateLabel.text = [formatter stringFromDate:self.run.timestamp];
 
    self.timeLabel.text = [NSString stringWithFormat:@"Time: %@",  [MathController stringifySecondCount:self.run.duration.intValue usingLongFormat:YES]];
 
    self.paceLabel.text = [NSString stringWithFormat:@"Pace: %@",  [MathController stringifyAvgPaceFromDist:self.run.distance.floatValue overTime:self.run.duration.intValue]];
}

This sets up the details of the run into the various labels.

Rendering the map will require just a little more detail. There are three basic steps to it. First, the region needs to be set so that only the run is shown and not the entire world! Then the line drawn over the top to indicate where the run went needs to be created, and finally styled.

Add the following method:

- (MKCoordinateRegion)mapRegion
{
    MKCoordinateRegion region;
    Location *initialLoc = self.run.locations.firstObject;
 
    float minLat = initialLoc.latitude.floatValue;
    float minLng = initialLoc.longitude.floatValue;
    float maxLat = initialLoc.latitude.floatValue;
    float maxLng = initialLoc.longitude.floatValue;
 
    for (Location *location in self.run.locations) {
        if (location.latitude.floatValue < minLat) {
            minLat = location.latitude.floatValue;
        }
        if (location.longitude.floatValue < minLng) {
            minLng = location.longitude.floatValue;
        }
        if (location.latitude.floatValue > maxLat) {
            maxLat = location.latitude.floatValue;
        }
        if (location.longitude.floatValue > maxLng) {
            maxLng = location.longitude.floatValue;
        }
    }
 
    region.center.latitude = (minLat + maxLat) / 2.0f;
    region.center.longitude = (minLng + maxLng) / 2.0f;
 
    region.span.latitudeDelta = (maxLat - minLat) * 1.1f; // 10% padding
    region.span.longitudeDelta = (maxLng - minLng) * 1.1f; // 10% padding
 
    return region;
}

An MKCoordinateRegion represents the display region for the map, and you define it by supplying a center point and a span that defines horizontal and vertical ranges.

For example, my jog may be quite zoomed in around my short route, while my more athletic friend’s map will appear more zoomed out to cover all the distance she traveled. :]

It’s important to also account for a little padding, so that the route doesn’t crowd all the way to the edge of the map.

Next, add the following method:

- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id < MKOverlay >)overlay
{
    if ([overlay isKindOfClass:[MKPolyline class]]) {
        MKPolyline *polyLine = (MKPolyline *)overlay;
        MKPolylineRenderer *aRenderer = [[MKPolylineRenderer alloc] initWithPolyline:polyLine];
        aRenderer.strokeColor = [UIColor blackColor];
        aRenderer.lineWidth = 3;
        return aRenderer;
    }
 
    return nil;
}

This method says that whenever the map comes across a request to add an overlay, it should check if it’s an MKPolyline. If so, it should use a renderer that will make a black line. You’ll spice this up shortly. An overlay is something that is drawn on top of a map view. A polyline is such an overlay and represents a line drawn from a series of location points.

Lastly, you need to define the coordinates for the polyline. Add the following method:

- (MKPolyline *)polyLine {
 
    CLLocationCoordinate2D coords[self.run.locations.count];
 
    for (int i = 0; i < self.run.locations.count; i++) {
        Location *location = [self.run.locations objectAtIndex:i];
        coords[i] = CLLocationCoordinate2DMake(location.latitude.doubleValue, location.longitude.doubleValue);
    }
 
    return [MKPolyline polylineWithCoordinates:coords count:self.run.locations.count];
}

Here, you shove the data from the Location objects into an array of CLLocationCoordinate2D structs, the required format for polylines.

Now, it’s time to put these three together! Add the following method:

- (void)loadMap
{
    if (self.run.locations.count > 0) {
 
        self.mapView.hidden = NO;
 
        // set the map bounds
        [self.mapView setRegion:[self mapRegion]];
 
        // make the line(s!) on the map
        [self.mapView addOverlay:[self polyLine]];
 
    } else {
 
        // no locations were found!
        self.mapView.hidden = YES;
 
        UIAlertView *alertView = [[UIAlertView alloc]
                                  initWithTitle:@"Error"
                                  message:@"Sorry, this run has no locations saved."
                                  delegate:nil
                                  cancelButtonTitle:@"OK"
                                  otherButtonTitles:nil];
        [alertView show];
    }
}

Here, you make sure that there are points to draw, set the map region as defined earlier, and add the polyline overlay. Add the following code at the end of configureView:

[self loadMap];

And now build & run! You should see a map after your simulator is done with its workout.

Notice how the default debug location just happens to be next-door to Apple :]

Finding the Right Color

The app is pretty cool as-is, but one way you can help your users train even smarter is to show them how fast or slow they ran at each leg of the run. That way, they can identify areas where they are most at risk of straying from an even pace.

Click File\New\File. Select iOS\Cocoa Touch\Objective-C class. Call the class MulticolorPolylineSegment, make it a subclass of MKPolyline, and create it. Then open MulticolorPolylineSegment.h and make it look like this:

#import <MapKit/MapKit.h>
 
@interface MulticolorPolylineSegment : MKPolyline
 
@property (strong, nonatomic) UIColor *color;
 
@end

This special, custom polyline will be used to render each segment of the run. The color is going to denote the speed, so the color of each segment is stored here on the polyline. Other than that, it’s the same as an MKPolyline. There will be one of these objects for each segment connecting two locations.

Next, you need to figure out how to assign the right color to the right polyline segment. That sounds like… math! Open MathController.h and add the following class method declaration:

+ (NSArray *)colorSegmentsForLocations:(NSArray *)locations;

Then open MathController.m and add the following imports at the top of the file:

#import "Location.h"
#import "MulticolorPolylineSegment.h"

Then add the following implementation of the method you just declared in the header:

+ (NSArray *)colorSegmentsForLocations:(NSArray *)locations
{
    // make array of all speeds, find slowest+fastest
    NSMutableArray *speeds = [NSMutableArray array];
    double slowestSpeed = DBL_MAX;
    double fastestSpeed = 0.0;
 
    for (int i = 1; i < locations.count; i++) {
        Location *firstLoc = [locations objectAtIndex:(i-1)];
        Location *secondLoc = [locations objectAtIndex:i];
 
        CLLocation *firstLocCL = [[CLLocation alloc] initWithLatitude:firstLoc.latitude.doubleValue longitude:firstLoc.longitude.doubleValue];
        CLLocation *secondLocCL = [[CLLocation alloc] initWithLatitude:secondLoc.latitude.doubleValue longitude:secondLoc.longitude.doubleValue];
 
        double distance = [secondLocCL distanceFromLocation:firstLocCL];
        double time = [secondLoc.timestamp timeIntervalSinceDate:firstLoc.timestamp];
        double speed = distance/time;
 
        slowestSpeed = speed < slowestSpeed ? speed : slowestSpeed;
        fastestSpeed = speed > fastestSpeed ? speed : fastestSpeed;
 
        [speeds addObject:@(speed)];
    }
 
    return speeds;
}

This method returns the array of speed values for each sequential pair of locations.

The first thing you’ll notice is a loop through all the locations from the input. You have to convert each Location to a CLLocation so you can use distanceFromLocation:.

Remember basic physics: distance divided by time equals speed. Each location after the first is compared to the one before it, and by the end of the loop you have a complete collection of all the changes in speed throughout the run.

Next, add the following code, just before the return in the method you just added:

// now knowing the slowest+fastest, we can get mean too
double meanSpeed = (slowestSpeed + fastestSpeed)/2;
 
// RGB for red (slowest)
CGFloat r_red = 1.0f;
CGFloat r_green = 20/255.0f;
CGFloat r_blue = 44/255.0f;
 
// RGB for yellow (middle)
CGFloat y_red = 1.0f;
CGFloat y_green = 215/255.0f;
CGFloat y_blue = 0.0f;
 
// RGB for green (fastest)
CGFloat g_red = 0.0f;
CGFloat g_green = 146/255.0f;
CGFloat g_blue = 78/255.0f;

Here you define the three colors you’ll use for slow, medium and fast polyline segments.

Each color, in turn, has its own RGB components. The slowest components will be completely red, the middle will be yellow, and the fastest will be green. Everything else will be a blend of the two nearest colors, so the end result could be quite colorful.

22_color_codes

And finally, remove the existing return and add the following to the end of the method:

NSMutableArray *colorSegments = [NSMutableArray array];
 
for (int i = 1; i < locations.count; i++) {
  Location *firstLoc = [locations objectAtIndex:(i-1)];
  Location *secondLoc = [locations objectAtIndex:i];
 
  CLLocationCoordinate2D coords[2];
  coords[0].latitude = firstLoc.latitude.doubleValue;
  coords[0].longitude = firstLoc.longitude.doubleValue;
 
  coords[1].latitude = secondLoc.latitude.doubleValue;
  coords[1].longitude = secondLoc.longitude.doubleValue;
 
  NSNumber *speed = [speeds objectAtIndex:(i-1)];
  UIColor *color = [UIColor blackColor];
 
  // between red and yellow
  if (speed.doubleValue < meanSpeed) {
    double ratio = (speed.doubleValue - slowestSpeed) / (meanSpeed - slowestSpeed);
    CGFloat red = r_red + ratio * (y_red - r_red);
    CGFloat green = r_green + ratio * (y_green - r_green);
    CGFloat blue = r_blue + ratio * (y_blue - r_blue);
    color = [UIColor colorWithRed:red green:green blue:blue alpha:1.0f];
 
    // between yellow and green
  } else {
    double ratio = (speed.doubleValue - meanSpeed) / (fastestSpeed - meanSpeed);
    CGFloat red = y_red + ratio * (g_red - y_red);
    CGFloat green = y_green + ratio * (g_green - y_green);
    CGFloat blue = y_blue + ratio * (g_blue - y_blue);
    color = [UIColor colorWithRed:red green:green blue:blue alpha:1.0f];
  }
 
  MulticolorPolylineSegment *segment = [MulticolorPolylineSegment polylineWithCoordinates:coords count:2];
  segment.color = color;
 
  [colorSegments addObject:segment];
}
 
return colorSegments;

In this loop, you determine the value of each pre-calculated speed, relative to the full range of speeds. This ratio then determines the UIColor to apply to the segment.

Next, you construct a new MulticolorPolylineSegment with the two coordinates and the blended color.

Finally, you collect all the multicolored segments together, and you’re almost ready to render!

Applying the Multicolored Segments

Repurposing the detail view controller to use your new multicolor polyline is actually quite simple! Open DetailViewController.m and add the following import to the top of the file:

#import "MulticolorPolylineSegment.h"

Now, find loadMap. Replace the following line:

[self.mapView addOverlay:[self polyLine]];

with:

NSArray *colorSegmentArray = [MathController colorSegmentsForLocations:self.run.locations.array];
[self.mapView addOverlays:colorSegmentArray];

This creates the array of segments using the math controller and adds all the overlays to the map.

Lastly, you need to prepare your polyline renderer to pay attention to the specific color of each segment. So replace your current implementation of mapView:rendererForOverlay: with the following:

- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id < MKOverlay >)overlay
{
    if ([overlay isKindOfClass:[MulticolorPolylineSegment class]]) {
        MulticolorPolylineSegment *polyLine = (MulticolorPolylineSegment *)overlay;
        MKPolylineRenderer *aRenderer = [[MKPolylineRenderer alloc] initWithPolyline:polyLine];
        aRenderer.strokeColor = polyLine.color;
        aRenderer.lineWidth = 3;
        return aRenderer;
    }
 
    return nil;
}

This is very similar to what you had before, but now the specific color of each segment renders individually.

Alright! Now you’re all set to build & run, let the simulator go on a little jog, and check out the fancy multi-colored map afterward!

23_multicolor_polyline

Leaving a Trail Of Breadcrumbs

That post-run map is stunning, but how about one during the run? The Breadcrumb sample project from Apple has this functionality, but as of the writing of this article, it uses some deprecated methods from pre-iOS 7.

Open Main.storyboard and find the ‘New Run’ view controller. Drag in a new MKMapView:

37_map_new_run

Then open NewRunViewController.m and add this import at the top:

#import <MapKit/MapKit.h>

And add the MKMapViewDelegate protocol conformance declaration to this line:

@interface NewRunViewController () <UIActionSheetDelegate, CLLocationManagerDelegate, MKMapViewDelegate>

Next, add an IBOutlet for the map to the class extension category:

@property (nonatomic, weak) IBOutlet MKMapView *mapView;

Then add this line to the end of viewWillAppear::

self.mapView.hidden = YES;

This makes sure that the map is hidden at first. Now add this line to the end of startPressed::

self.mapView.hidden = NO;

This makes the map appear when the run starts.

The trail is going to be another polyline, so it’s time to add your old friend, mapView:rendererForOverlay:. Add the following method:

- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id < MKOverlay >)overlay
{
    if ([overlay isKindOfClass:[MKPolyline class]]) {
        MKPolyline *polyLine = (MKPolyline *)overlay;
        MKPolylineRenderer *aRenderer = [[MKPolylineRenderer alloc] initWithPolyline:polyLine];
        aRenderer.strokeColor = [UIColor blueColor];
        aRenderer.lineWidth = 3;
        return aRenderer;
    }
    return nil;
}

This version is similar to the one for the run details screen, except that the strokeColor is always blue here.

Next, you need to write the code to update the map region and draw the polyline every time a valid location is found. Find your current implementation of locationManager:didUpdateLocations: and update it to this:

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations
{
    for (CLLocation *newLocation in locations) {
 
        NSDate *eventDate = newLocation.timestamp;
 
        NSTimeInterval howRecent = [eventDate timeIntervalSinceNow];
 
        if (fabs(howRecent) < 10.0 && newLocation.horizontalAccuracy < 20) {
 
            // update distance
            if (self.locations.count > 0) {
                self.distance += [newLocation distanceFromLocation:self.locations.lastObject];
 
                CLLocationCoordinate2D coords[2];
                coords[0] = ((CLLocation *)self.locations.lastObject).coordinate;
                coords[1] = newLocation.coordinate;
 
                MKCoordinateRegion region =
                MKCoordinateRegionMakeWithDistance(newLocation.coordinate, 500, 500);
                [self.mapView setRegion:region animated:YES];
 
                [self.mapView addOverlay:[MKPolyline polylineWithCoordinates:coords count:2]];
            }
 
            [self.locations addObject:newLocation];
        }
    }
}

Now, the map always centers on the most recent location, and constantly adds little blue polylines to show the user’s trail thus far.

Open Main.storyboard and find the ‘New Run’ view controller. Connect the outlet for mapView to the map view, and set the mapView’s delegate to the view controller.

Build & run, and start a new run. You’ll see the map updating in real-time!

38_new_breadcrumbs

Where To Go From Here

Great job! There are a few cool ways to go from here:

  • Use the altitude information from the location updates in NewRunViewController to figure out how hilly the route is.
  • If you’re up for a pure-math challenge, try blending the segment colors more smoothly by averaging a segment’s speed with that of the segment before it. A small sketch of one possible approach follows below.
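
Here’s a rough sketch of that second idea, assuming you tweak the final loop of colorSegmentsForLocations: in MathController.m. The smoothedSpeed variable is just for illustration:

// Average each segment's speed with the previous segment's speed before picking a color.
NSNumber *rawSpeed = [speeds objectAtIndex:(i-1)];
double smoothedSpeed = rawSpeed.doubleValue;
if (i > 1) {
    double previousSpeed = [[speeds objectAtIndex:(i-2)] doubleValue];
    smoothedSpeed = (rawSpeed.doubleValue + previousSpeed) / 2;
}
// ...then use smoothedSpeed in place of speed.doubleValue when computing the color ratio.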

Stay tuned for part two of the tutorial, where you’ll introduce a badge system personalized for each user.

As always, feel free to post comments and questions!

The post How To Make an App Like RunKeeper: Part 1 appeared first on Ray Wenderlich.
