Channel: Kodeco | High quality programming tutorials: iOS, Android, Swift, Kotlin, Unity, and more

AsyncDisplayKit with René Cacheaux – Podcast S04 E07

Learn about Facebook’s AsyncDisplayKit with Mic, Jake, and René!

[Subscribe in iTunes] [RSS Feed]

Our Sponsor

Interested in sponsoring a podcast episode? We sell ads via Syndicate Ads, check it out!

Links

Contact Us

Where To Go From Here?

We hope you enjoyed this episode of our podcast. Be sure to subscribe in iTunes to get notified when the next episode comes out.

We’d love to hear what you think about the podcast, and any suggestions on what you’d like to hear next season. Feel free to drop a comment here, or email us anytime at podcast@raywenderlich.com.

The post AsyncDisplayKit with René Cacheaux – Podcast S04 E07 appeared first on Ray Wenderlich.


Video Tutorial: Introducing Stack Views Part 4: Animating Stack Views

Video Tutorial: Introducing Stack Views Part 5: UIStackView In Code

Video Tutorial: Introducing Stack Views Part 6: Conclusion

RWDevCon Inspiration Talk – NSBrief by Saul Mora


Note from Ray: At our recent RWDevCon tutorial conference, in addition to hands-on tutorials, we also had a number of “inspiration talks” – non-technical talks with the goal of giving you a new idea, some battle-won advice, and leaving you excited and energized.

We recorded these talks so that you can enjoy them, even if you didn’t get to attend the conference. Here’s our next talk – NSBrief by Saul Mora – I hope you enjoy!


Transcript

Hey everyone thanks again for joining me for another NSBrief. How many of you here have heard me say that on the podcast? Maybe a couple of you here? Alright, I’ve got a few listeners in the audience.

For those of you that might not be familiar with NSBrief the podcast, it’s a pretty popular podcast on the internet these days. We’ve got a nice little website here that’s fairly easy to read by now.

We’ve got global reach. We’ve touched a lot of different countries.

Screen Shot 2015-07-14 at 8.47.00 PM

This kind of thing definitely changes from month to month, and this particular month, I think, China was scraping a few more sites than normal.

Over the years, NSBrief has definitely grown from a little hobby of mine to something that is actually fairly popular and fairly well respected in the community, in the iOS community. It’s been really nice. This is the number of our total downloads just for the last couple of years:

TotalDownloads

It’s a lot of people listening to stuff that I say and my guests say, so that’s kind of nice. It’s nice that people actually hear me talk. NSBrief does get popular on iTunes. You can see me up there in the “What’s Hot” category.

Screen Shot 2015-07-14 at 8.48.51 PM

It’s amazing to be among all these other shows that I know have far higher production value than what I’m able to put in to NSBrief, again just as a hobby.

Now I took this one a few days ago.

Screen Shot 2015-07-14 at 8.49.05 PM

I looked at the software category in the How To section and I looked at just the top podcast, if you notice there at the bottom, I’m ahead of some other podcasts you might not have heard of (hehe). So yeah that just happens to be that way.

It’s a lot of fun, but one of the things that I was really proud of (and I don’t really cruise iTunes for podcasts all that often) was that last year in December there was this whole Hour Of Code thing going on. iTunes had a whole section about the Hour of Code and in the podcasts towards the bottom there’s a lot of really awesome podcasts, and NSBrief is included. This is definitely an honor to be in really good company with all these other really great podcasters.

The thing that I’m really proud of, as a producer of a show and content that people really like to listen to, is the star rating. You can see here, this is an aggregation of all the iTunes stores all over the world.

Screen Shot 2015-07-14 at 8.52.36 PM

They’ve got 103 users rating stuff. I’ve got a 4.9-star rating on the iTunes App Store and a lot of really great reviews. It’s really nice to know that people really like this show, so if you haven’t listened to it, I guess this is your sales pitch to go give it a try. It’s really helpful and insightful.

My Start In College Radio

How did I get started with all this craziness with talking to people and starting a podcast? It turns out that when I was in college, I actually was on this student radio station. I did a little bit of crappy DJ-ing as you do in college, and did a little bit of sportscasting going to some of the sports games. That was a lot of fun.

I’ll let you know though, this was towards the end of the dot-com 1.0 bubble. There was just a lot of crazy new web technology, and our student radio station only had a 1-kilowatt radio, so we had to extend our reach. We were so advanced that we had our own RealAudio server, so we could expand beyond our campus just a little bit.

Deciding On A Name

So podcasting. What do you need to get started with a podcast? The one thing that you really need is a good name.

How do you name a podcast? Especially, now that you know NSBrief, how do I come up with these crazy names?

At the time I was listening to a lot of podcasts that were really popular back in the day.

Screen Shot 2015-07-14 at 9.16.07 PM

Late Night Cocoa was a really influential podcast for me. It really got me interested in iOS (it was iPhone OS development at the time and Cocoa development) and hearing Scotty just doing what he does, talking to people and asking questions was cool. That was a really cool name, but that was taken, scratch that one.

Core Intuition was another popular podcast at the time. I remember listening to the first episodes of that one and hearing Manton and Daniel just talking about the things that they do. I thought, “Well Core Intuition is kind of techy and nerdy and stuff, so why don’t we follow that vein and call it CFPodcast?”

It was a little too niche, as you can see from the reaction right here; there’s nobody laughing at that one. A little bit better, a little higher level of abstraction, might be NSPodcast.

I was also listening to a lot of rails stuff at the time and I had come off of a lot of rails development. Rails Envy was a fairly popular podcast. I thought, “Well maybe I could do that.” That didn’t really fly.

It turned out CocoaRadio was actually already taken by the time I had decided to do a podcast. Justin Williams actually kind of stole it from somebody else. It was kind of weird.

I thought, “Well it’s going to be a technical podcast, so I should really make a technical name right?” So I came up with “CocoaBytes” but it had the connotation that Cocoa sucks, if you said it wrong and didn’t see the spelling, so that was kind of not good.

I took a step back and kind of wondered. I had this description. It was me and a friend of mine that actually thought we should do a podcast. It was at a conference in San Diego in 2010 and we were like, “Well we just want to do a short podcast, very brief podcast talking about really technical bits.” Then as we were reiterating what we wanted the goal of this podcast to be, this word “brief” stuck out to us.

It stuck out to me anyway. I was like, “We should do this and call it NSBrief.” It’s great because Brief has two connotations, because if we actually don’t succeed on the whole short thing we can just say, “We’re just being informative.” Hence NSBrief was the chosen name of the podcast.

Can We Actually Use This Name?

However, there was a dilemma. I don’t know if you ever followed a lot of the app store mishaps in the early days where apps were rejected for almost no reason or random reasons or it’s really unclear. Although that really hasn’t gone away has it?

Screen Shot 2015-07-14 at 10.07.13 PM

Apps were having a hard time with being experimental and such back in the day. One such app was named Briefs.

Does everybody remember this app? It’s by Rob Rhyne. Me being conscientious, somebody that’s trying to get onto the scene, just trying to cover my bases, I was like, “Brief and Briefs are close enough to be synonymous in the same field, so I should really just ask the author of this app for permission.”

I sent them an email, I tracked them down. I didn’t know who it was and I found his blog and I’m like, “Hey Rob, I’m going to start this podcast called NSBrief and I noticed your app is Briefs. Are you okay if I called my podcast this?”

He’s like, “Sure. Thumbs up, go right ahead.”

I’m like, “Sweet.” Then I got my thinking hat on and realized, “Hey wait do you want to be a guest on this podcast I’m just starting? I haven’t started it yet.”

Rob Rhyne, CEO of Martian Craft back then and now, was the very first guest of NSBrief. It was a very crude interview. It was done over a telephone line. We had something that would actually record the telephone conversation. The audio quality is horrible if you go back to it. You can go to Nsbrief.com and look at the very first episode.

We talk about his adventures getting Briefs into the App Store, which basically never really succeeded. The one that’s out there now is a completely different version than was originally intended.

Evolution of Sound

I use GarageBand, which is a very simple, free way to edit audio and put stuff together, and I still use GarageBand to edit NSBrief. I also started with a fairly cheap microphone.

Screen Shot 2015-07-14 at 9.18.03 PM

This was maybe $40 – $50; the quality wasn’t so great, but it was alright. It made it through maybe a few episodes, maybe 10 episodes. I would talk to some friends of mine at conferences and I would actually even record with my iPhone. I’d get the microphone app and just record the interview, and it was a little hokey, but I wasn’t sure I was invested in this thing. I just wanted to try it out, do it on the cheap and see what I could do. Just GarageBand and whatever I had with me.

Hey it worked; I got a few episodes out of it, but then I was like, “I need to make a breakthrough.” I need to do more because it wasn’t really taking off like I’d hoped.

The thing is, on one New Year’s Eve I made a stupid resolution. I bet myself that I could post a new episode once a week for a year. Does anybody make New Year’s resolutions? Who does crap like that?

I did and it worked. Challenge accepted!

001_ChallengeAccepted

It took a little while, but things kept going. These are my original stats from when I hosted the feed on feed burner.

Screen Shot 2015-07-14 at 9.19.35 PM

That very first number down there is not quite zero, it’s just 20. Only 20 people had heard that first episode back in the beginning, but it slowly crept up and crept up, and it turned out that the more frequently I produced episodes, the more listeners I gained, and I got a lot more feedback over time too.

I eventually invested in a little bit more high quality gear. I’ve got the microphone and I’ve got the actual arm stands, so I looked like an actual radio person.

Screen Shot 2015-07-14 at 10.10.43 PM

I got a little soundboard down there. That’s all my test devices if you’re really interested.

One of the things that was really useful was this portable voice recorder.

Screen Shot 2015-07-14 at 9.20.41 PM

I’ve actually still got it here. This was really useful, because it was much higher quality. It’s also in stereo, whereas if you just use the iPhone microphone it’s only a mono audio feed. Instead you can take this and have people’s voices in stereo, which sounds a little more comfortable to your ears.

I would go to meetups and things and ask friends, “Hey, you want to do this podcast?” For this one with Tom Harrington, what I did was just say, “Hey, let’s talk about Core Data.”

Screen Shot 2015-07-14 at 9.21.54 PM

What we’re talking about here, actually, is just going through some Stack Overflow posts and answering random questions. You can go to the show notes and see that episode, the questions that we asked, and how we answered them.

You can also see down there at the bottom how crude my set up was. All I did was just talk normally and the voice recorder was really down there, way down there in the bottom. Now that can come back to haunt me later.

Needless to say I have improved my technique a little bit. I’ve got some better equipment, some better microphones. Well that’s what happens, you get better and better at your craft and you have better tools to do a better job.

Lesson Learned #1: Audio Quality Is Important

I’ve made my fair share of mistakes producing NSBrief. One of them is actually producing the audio. Having not done audio or been an audio engineer or anything like that in college or anywhere else before, I was basically learning on the fly. That didn’t really turn out well in the early days.

It turned out later that I actually asked for some help and got a lot of help from the community to help me learn how to do audio and make it so that people could actually hear the podcast and hear the content that they wanted to get to so much.

It was nice to at least get some feedback on that, but I only got a 2-star rating for it. It’s still up there. It’s kind of sad, but moving on.

Lesson Learned #2: Website Readability Is Important

Another mistake that I made was putting the original site on Tumblr. Anybody here ever used Tumblr? Yeah, see, that’s why it was a big mistake. Tumblr was okay for a while. At the time it actually went down a lot, like every day. It was really horrible.

It was easy to use and it looked decent. Here’s our very first episode page on Tumblr. You can actually go see this if you go to NSBrief.tumblr.com. It’s still up there. I have not taken it down.

Screen Shot 2015-07-14 at 9.22.54 PM

You can see:

  • My cubed logo is there.
  • I have this “Ask Me Anything” section (I have no idea what that does).
  • You can subscribe in iTunes.
  • There’s a lot of words.
  • It’s very busy.

It’s a Tumblr site. We don’t really have a whole lot of control.

Moving on I thought, “Well I got to find a better way to host these things. This site has to stay up a little bit more.” Posterous, that was kind of like Tumblr, but well we know what happened to Posterous. That got bought out and closed, but luckily the internet saved me for this talk because I have a screen shot of my site.

Screen Shot 2015-07-14 at 9.24.00 PM

It turns out that back in the early days, I found this guy who was living in his mother-in-law’s basement. He actually knew stuff about Cocos2D. We had an interview, I actually remembered this interview because we were talking about Cocos2D. I was still really curious about it at the time.

I remember halfway through the interview I was like, “Hey Ray, I don’t have any other questions. Tell me what to ask you, because I don’t know.” Basically I didn’t do my homework. The interview turned out well. It was very useful, but it was one of those things that was kind of odd.

Moving on, we moved over to Amazon and have a decent website for a developer. I don’t have the design chops that Ray does and all that. I had this put together from a template.

Screen Shot 2015-07-14 at 10.18.28 PM

It’s a little better, but there’s room for improvement. You can also see this, if you like, it’s on Amazon S3.

Now everything is hosted on a Mac mini in Las Vegas. I just figured I would bite the bullet, host it myself and take care of all of that. That has its pros and cons, but what it lets me do is my own thing with the website and have it hosted here. What’s nice about this website is that it’s obviously a lot cleaner. It’s much easier to read, and it’s mobile friendly!

I thought it was very ironic that I was doing an iOS-focused podcast and the website had not been very mobile friendly. Tumblr and Posterous and that template that I had were very hard to read on iOS devices. This site was easy to read on iOS.

The site was actually contributed by somebody from the community. Community has always been a big part of NSBrief. I put a call out to say, “Hey, is there anybody out there that wants to redesign the website?” It turns out that Tom Diggle wanted to do that. He did a really good job of putting together a really good website that I still use all the time.

Meeting Awesome Peeps

It turns out that over time, I’ve been meeting people and talking to them about their talks at conferences, like this one with Jay Trash.

Screen Shot 2015-07-14 at 9.26.38 PM

I made a lot of friends. I met a lot of people this way. It’s really nice to talk and converse and just make a lot of new friends this way. It’s an interesting way to have people remember you, because if there’s anything they’re going to remember, it’s that you interviewed them for a podcast.

I also made a few enemies. Some of you out there know that Ash Furrow is my Twitter nemesis. He’s been on the podcast a couple of times and we did have a battle in Amsterdam.

Screen Shot 2015-07-14 at 10.23.02 PM

I think I won. Go America!

You can check out those episodes:

The last one is actually the Swift grudge match, so you want to check that one out.

I’ve had my share of early exclusives as a podcaster. Eloy Duran I met out at SecondConf in Chicago when he announced CocoaPods. I interviewed him; he was a core team member on the MacRuby project that was running at the time and I wanted to talk about that, but he was also announcing CocoaPods. That was the last thing on my mind; I thought at the time MacRuby was much more important than this CocoaPods thing; who cares about CocoaPods! That podcast is on there. It’s interesting.

I also got to meet Laurent Sansonetti the developer of MacRuby at the time, but now he does RubyMotion. I met up with him in Belgium and got a little exclusive and sneak peek before that was announced.

I also talked to the Appsterdam guys beforehand and got the down-low on what was going on in Amsterdam and what they were trying to build out there.

It had a really big effect on my life: I took inspiration from this and went out to Amsterdam for a couple of months, met a lot of people there and made a lot of great friends. It changed the trajectory of my career a bit.

While I was out there, I was able to meet Scotty, the guy from Late Night Cocoa. I went to visit him at his place in Tetbury, England. If there’s any place that’s out in the middle of nowhere in England, it’s Tetbury, but the curry is really good. That’s a plus.

Screen Shot 2015-07-14 at 9.32.36 PM

Scotty has been a regular in the podcast and he’s a really good friend of the show.

I do a lot of interviews of just people from the community. Here’s one with Ariel Michaeli.

Screen Shot 2015-07-14 at 9.33.21 PM

He’s based in New York and he runs appFigures. He’s another one of these guys that started up as a one-man shop. I think his was actually a two-man shop, him and his brother. They started building not an app agency, but an app analytics service where they would analyze all of your iTunes Connect data.

I’ve talked to a lot of fun people; they have all been really great. But I have to say that out of all the interviews I have done, one that I remember the most is the one I did on an airplane.

Screen Shot 2015-07-14 at 9.34.23 PM

I met Jaime Newbury out at WWDC, and on the way back we just did an interview there. We talked about her work, her design work. We also talked about women in tech, since that’s a really big issue. That’s the kind of interview you’re not going to forget any time soon.

Brief Plans

After all of this, what’s next?

As mentioned in my introduction, it’s been a while now that I’ve been working at Coursera.

I’m a full time employee, so it’s been a little bit more difficult to keep up the weekly cadence of producing a podcast episode every week. I’ve been trying to reflect on what’s good for the community and when I get reviews like this,

Screen Shot 2015-07-14 at 9.35.12 PM

…it makes me think twice about canceling the show. Canceling wouldn’t really be good for the community, and it wouldn’t be good for me.

Again I was turning to the community and trying to think, “What can I do to keep this going without putting too much of a burden on myself?”

The one thing that I thought was, “I don’t have to do this all alone. This is not something I have to do by myself.” With that I’m announcing that I’m stepping down as primary host of NSBrief. We’re actually going to have a new host.

Janie Clayton is going to be the new host of NSBrief!

Screen Shot 2015-07-14 at 9.36.47 PM

Now that doesn’t mean that things are going to change all that much. She’s still going to do interviews, she’s still going to meet people, she’s still going to do a bang-up job.

For one she’s a lot more technically qualified than I am. She is a journalism major. She went to school to do the things that I already messed up on. She knows how to run the boards, she knows how to interview people. She has that experience.

She’s also a developer, a really hardcore developer. She’s also a little fresh around the ears, so she’s got a lot of questions that she wants to ask. She reminds me of myself way back when I first started NSBrief.

The elephant in the room is that she’s a woman, and that’s great. I love that. Janie is certainly a talented developer and a talented individual, such that she could do this all by herself. I felt that giving her this opportunity to take over for me as host of NSBrief will help her reach her goals a lot faster.

I’m really looking forward to working with Janie and having her be a key contributor to the community.

With that I think my time today is about up so I will say I hope to maybe interview some of you for the podcast, and thanks for listening.

Note from Ray: Did you enjoy this inspiration talk? If so, sign up to our RWDevCon newsletter to be notified when RWDevCon 2016 tickets go on sale!

The post RWDevCon Inspiration Talk – NSBrief by Saul Mora appeared first on Ray Wenderlich.

Sprite Kit Tutorial: Drag and Drop Sprites

Drag and drop these cute pets with Sprite Kit!

Update note: Team member Riccardo D’Antoni has updated this tutorial, originally by Ray Wenderlich (Cocos2D version) and updated by Jean-Pierre Distler (Objective-C version). It’s now fully up-to-date for iOS 8 and Swift!

Handling touches and swipes is right at the core of most apps – after all, it’s how your users interact with the user interface!

In Sprite Kit, there’s no UIButton equivalent to handle simple taps, and you need to implement touch handling yourself with callbacks or with gesture recognizers.

Not to worry though; in this tutorial, you’re going to learn all about handling touches and drag-and-drop:

  • The basics of dragging and dropping sprites with touches
  • How to scroll the view itself via touches
  • How to use gesture recognizers with Sprite Kit for even more cool effects!

To make things fun, you’ll be moving some cute animals from a Game Art Guppy art pack, on a background made by gwebstock.

This tutorial assumes you at least know the basics of Sprite Kit. If you are completely new to Sprite Kit, be sure to check out our Sprite Kit Swift Tutorial for Beginners first.

So without further ado, drag your fingers over to the keyboard and let’s get started!

Getting Started

Before you implement the touch handling, first you’ll create a basic Sprite Kit scene displaying the sprites and artwork.

Open up Xcode, go to File\New Project, choose the iOS\Application\Game template, and click Next.


Name the project DragDropSwift, select Swift as the Language, SpriteKit as the Game Technology, and iPhone for Devices. Click Next to continue, and select a location to save the project.


For layout simplicity, you’ll limit the app to landscape only. Select your DragDropSwift project in the Project Navigator, select your DragDropSwift target, and make sure only Landscape Left and Landscape Right are checked.


Open GameViewController.swift and delete the SKNode extension as well as viewDidLoad(). You won’t be using the scene from the scene editor, but will set up things from code instead.

Add the following method to GameViewController:

override func viewWillLayoutSubviews() { 
  // Configure the view.
  let skView = self.view as! SKView
  skView.showsFPS = true
  skView.showsNodeCount = true
 
  /* Sprite Kit applies additional optimizations to improve rendering performance */
  skView.ignoresSiblingOrder = true
 
  let scene = GameScene(size: skView.frame.size)
 
  /* Set the scale mode to scale to fit the window */
  scene.scaleMode = .AspectFill
 
  skView.presentScene(scene)
}

This is the standard Sprite Kit boilerplate to display the starter scene.

Open Images.xcassets. You can delete the starter Spaceship image since you’ll be dealing with friendly animals and not spaceships for this tutorial!

Next, download the resources you’ll need for this tutorial. Once you download and unzip the file, drag all of the images into the asset catalog.

dragdropsprites-assets

Once you’ve added the images to your project, open GameScene.swift and replace the entire contents of the file with the following:

import SpriteKit
 
private let kAnimalNodeName = "movable"
 
class GameScene: SKScene {
  let background = SKSpriteNode(imageNamed: "blue-shooting-stars")
  var selectedNode = SKSpriteNode()
}

You don’t need the starter “Hello World” scene; instead, you have a constant for the node name you’ll use to mark the movable animal sprites, and two properties to keep track of the background and the currently selected sprite.

Next, add the following initializers to the class:

required init?(coder aDecoder: NSCoder) {
  fatalError("init(coder:) has not been implemented")
}
 
override init(size: CGSize) {
  super.init(size: size)
 
  // 1
  self.background.name = "background"
  self.background.anchorPoint = CGPointZero
  // 2
  self.addChild(background)
 
  // 3
  let imageNames = ["bird", "cat", "dog", "turtle"]
 
  for i in 0..<imageNames.count {
    let imageName = imageNames[i]
 
    let sprite = SKSpriteNode(imageNamed: imageName)
    sprite.name = kAnimalNodeName
 
    let offsetFraction = (CGFloat(i) + 1.0)/(CGFloat(imageNames.count) + 1.0)
 
    sprite.position = CGPoint(x: size.width * offsetFraction, y: size.height / 2)
 
    background.addChild(sprite)
  }
}

The init(coder:) initializer is required by the compiler, but you won’t actually use it. The real logic is in the overridden init(size:); let’s go over it step by step.

  1. First, you set up the background for the scene by giving it a name that will be used in the app logic later on. Next, you set the anchor point of the image to the lower left of the image, at (0, 0). The anchor point in Sprite Kit is at the center of the node by default, which means you’re usually setting the center point when you set a node’s position. By setting the anchor point to the lower left corner, when you set the position of the node, you are now setting where the lower left corner is.
  2. You’re not setting the position of the background but just adding it to the node hierarchy. That means the position of the background defaults to (0,0). Hence, the lower left corner of the image is located at (0,0), and so the image (which is about 800 points long) extends off screen to the right.
  3. The next part of the method loops through the list of images to load. For each image, you create a node and place it on the scene. The nodes are distributed along the length of the screen for a nice initial layout. Setting the name of a node avoids the need to hold a reference to it, in case you need it later.
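As a quick sanity check on the spacing in step 3, here’s the same math as a standalone sketch. It uses Double in place of CGFloat so it runs without SpriteKit, and the 568-point landscape scene width is an assumed value for illustration, not from the project:

```swift
// Standalone sketch of the spacing math from step 3, using Double
// instead of CGFloat. The 568-point scene width is assumed.
let imageNames = ["bird", "cat", "dog", "turtle"]
let sceneWidth = 568.0

// Each sprite sits at fraction (i + 1)/(count + 1) of the width, so
// four sprites land at 0.2, 0.4, 0.6 and 0.8 of the way across.
let positions = (0..<imageNames.count).map { i in
  sceneWidth * (Double(i) + 1.0) / (Double(imageNames.count) + 1.0)
}

for (name, x) in zip(imageNames, positions) {
  print("\(name): x = \(x)")
}
```

The (i + 1)/(count + 1) form leaves equal gaps on both edges, which is why the sprites are evenly spread rather than bunched against the sides.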

That’s it! Build and run your code, and you should see some cute animals sitting there, just begging to be touched!

dragdropsprites-animals

Selecting Sprites based on Touches

Now you’ll work on the code to determine which sprite is selected based on the user’s current touch.

Add the implementation of touchesBegan(_:withEvent:) to start handling taps:

override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
  let touch = touches.anyObject() as UITouch
  let positionInScene = touch.locationInNode(self)
 
  selectNodeForTouch(positionInScene) 
}

First you get the touch from the set of touches, and convert the touch coordinates to the coordinate system of the scene. With this position, you call selectNodeForTouch(_:), which you’ll implement next to select one of the animals.

Add the following helper methods to the class next:

func degToRad(degree: Double) -> CGFloat {
  return CGFloat(Double(degree) / 180.0 * M_PI)
}
 
func selectNodeForTouch(touchLocation: CGPoint) {
  // 1
  let touchedNode = self.nodeAtPoint(touchLocation)
 
  if touchedNode is SKSpriteNode {
    // 2
    if !selectedNode.isEqual(touchedNode) {
      selectedNode.removeAllActions()
      selectedNode.runAction(SKAction.rotateToAngle(0.0, duration: 0.1))
 
      selectedNode = touchedNode as! SKSpriteNode
 
      // 3
      if touchedNode.name! == kAnimalNodeName {
        let sequence = SKAction.sequence([SKAction.rotateByAngle(degToRad(-4.0), duration: 0.1),
          SKAction.rotateByAngle(0.0, duration: 0.1),
          SKAction.rotateByAngle(degToRad(4.0), duration: 0.1)])
        selectedNode.runAction(SKAction.repeatActionForever(sequence))
      }
    }
  }
}

Sprite Kit uses radians for rotation, so the first method converts an angle given in degrees to radians.

selectNodeForTouch(_:) selects one of the animal sprites based on location, in three steps:

  1. First, it finds the node at the touchLocation.
  2. If the node found is an SKSpriteNode instance, you first check whether it’s the same as the previously selected node; in that case there is nothing to do. If this is a freshly selected node, you reset the previous selection by removing all of its actions and rotating it back to its original, unrotated state, then store the new node in selectedNode.
  3. This if-statement checks if the selected node is one of the animatable animal nodes by checking the name property that you set. If so, you create a sequence of actions for a “wiggle” animation, like the one on the home screen when rearranging/deleting apps, and then run this sequence on the selected node. To keep the level of excitement high, you run it as an action that is repeated forever.

Build and run your code, and you should now be able to tap on the animals. When you tap them they should wiggle in a particularly cute way to show that they are selected!

Sprites wiggling indicating selection

Moving Sprites and the Layer based on Touches

Time to make these animals move! The basic idea is you’ll implement touchesMoved(_:withEvent:), and figure out how much the touch has moved since last time. If an animal is selected, it will move the animal by that amount. If an animal is not selected, it will move the entire layer instead, so that the user can scroll the layer from left to right.

To understand how you can scroll a node in Sprite Kit, start by taking a look at the image below:

Scrolling layers with Cocos2D

As you can see, you’ve set up the background so the anchor point (the lower left) is at (0, 0), and the rest extends off to the right. The black area indicates the current visible area (the size of the window).

If you want to scroll the image 100 points to the right, you can do that by moving the entire Sprite Kit node 100 points to the left, as you can see in the second image.

You also want to make sure you don’t scroll too far. For example, you shouldn’t be able to move the layer to the right, since there would be a blank space where the background doesn’t cover.

Now that you’re armed with this background information, let’s see what it looks like in code! Still in GameScene.swift, add the following new methods to the class:

func boundLayerPos(aNewPosition: CGPoint) -> CGPoint {
  let winSize = self.size
  var retval = aNewPosition
  retval.x = CGFloat(min(retval.x, 0))
  retval.x = CGFloat(max(retval.x, -(background.size.width) + winSize.width))
  retval.y = self.position.y
 
  return retval
}
 
func panForTranslation(translation: CGPoint) {
  let position = selectedNode.position
 
  if selectedNode.name! == kAnimalNodeName {
    selectedNode.position = CGPoint(x: position.x + translation.x, y: position.y + translation.y)
  } else {
    let aNewPosition = CGPoint(x: position.x + translation.x, y: position.y + translation.y)
    background.position = self.boundLayerPos(aNewPosition)
  }
}

The first method boundLayerPos(_:) is used for making sure you don’t scroll the layer beyond the bounds of the background image. You pass in the coordinates of where you’d like to move the layer, and it returns a possibly modified point to make sure you don’t scroll too far.

The next method panForTranslation(_:) first checks whether selectedNode is an animal node and, if so, moves it by the passed-in translation. If the selected node is the background layer instead, it runs the new position through boundLayerPos(_:) before setting it, to make sure you can't scroll too far to the left or right.

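To see the bounding math in action, here's the same clamping logic in plain Swift, stripped of the Sprite Kit types. The background and window widths below are hypothetical values chosen for illustration, not sizes from the project:

```swift
// Clamp a proposed x offset so the background never shows a gap.
// Mirrors the logic of boundLayerPos(_:) using plain Doubles.
func boundedX(_ proposedX: Double, backgroundWidth: Double, windowWidth: Double) -> Double {
    var x = min(proposedX, 0)                   // never scroll right of the start
    x = max(x, -backgroundWidth + windowWidth)  // never scroll past the right edge
    return x
}

// Hypothetical sizes: a 2048-point background in a 1024-point window.
boundedX(100,   backgroundWidth: 2048, windowWidth: 1024)  // 0: can't move right
boundedX(-500,  backgroundWidth: 2048, windowWidth: 1024)  // -500: within range
boundedX(-2000, backgroundWidth: 2048, windowWidth: 1024)  // -1024: clamped at the far edge
```

A positive (rightward) offset is clamped straight back to 0, while a large negative offset stops at -backgroundWidth + windowWidth, exactly as in boundLayerPos(_:).
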
Now you can implement touchesMoved(_:withEvent:) to start handling pans:

override func touchesMoved(touches: NSSet, withEvent event: UIEvent) {
  let touch = touches.anyObject() as UITouch
  let positionInScene = touch.locationInNode(self)
  let previousPosition = touch.previousLocationInNode(self)
  let translation = CGPoint(x: positionInScene.x - previousPosition.x, y: positionInScene.y - previousPosition.y)
 
  panForTranslation(translation)
}

Like you did in touchesBegan(_:withEvent:) you first get the touch and convert its position to the position in your scene. To calculate the translation, or how far you’ve dragged your finger on the screen, you need to start with the previous location of the touch.

With the current and previous locations, you compute the translation by subtracting the previous location from the current one. Finally, you call panForTranslation(_:) with the calculated translation to handle the scrolling.

Give it a shot – build and run your code, and you should now be able to move the sprites and the layer by dragging!

Dragging sprites with touch with Cocos2D

How to Use Gesture Recognizers with Sprite Kit

There’s another way to accomplish what you just did with Sprite Kit touch handling – use gesture recognizers instead!

Gesture recognizers are a great way to detect different gestures like taps, double taps, swipes or pans. Instead of implementing the touch-handling methods yourself and trying to distinguish between taps, double taps, swipes, pans and pinches, you simply create a gesture recognizer object for the gesture you want to detect, and add it to the view.

They are extremely easy to use, and they work with Sprite Kit without any trouble. Let's see how.

First, comment out the touch handling methods, touchesBegan(_:withEvent:) and touchesMoved(_:withEvent:) since you will be using a different method now.

Next, add the following method to the class:

override func didMoveToView(view: SKView) {
  let gestureRecognizer = UIPanGestureRecognizer(target: self, action: Selector("handlePanFrom:"))
  self.view!.addGestureRecognizer(gestureRecognizer)
}

This method gets called when the scene is first presented. Here, you create a pan gesture recognizer and initialize it with your scene as the target and handlePanFrom: as the callback method. Finally, you add the gesture recognizer to your scene’s presenting view.

Note: You may ask yourself why the recognizer is added here and not in the initializer. SKScene has a view property that holds the SKView presenting the scene, but unfortunately this property is only set when the scene is actually presented on the screen; it is nil while the object is initializing. didMoveToView(_:) is similar to viewDidAppear(_:) in UIKit, and gets called after your scene is presented.

Next, add the following to the class:

func handlePanFrom(recognizer: UIPanGestureRecognizer) {
  if recognizer.state == .Began {
    var touchLocation = recognizer.locationInView(recognizer.view)
    touchLocation = self.convertPointFromView(touchLocation)
 
    self.selectNodeForTouch(touchLocation)
  } else if recognizer.state == .Changed {
    var translation = recognizer.translationInView(recognizer.view!)
    translation = CGPoint(x: translation.x, y: -translation.y)
 
    self.panForTranslation(translation)
 
    recognizer.setTranslation(CGPointZero, inView: recognizer.view)
  } else if recognizer.state == .Ended {
    if selectedNode.name != kAnimalNodeName {
      let scrollDuration = 0.2
      let velocity = recognizer.velocityInView(recognizer.view)
      let pos = selectedNode.position
 
      // This just multiplies your velocity with the scroll duration.
      let p = CGPoint(x: velocity.x * CGFloat(scrollDuration), y: velocity.y * CGFloat(scrollDuration))
 
      var newPos = CGPoint(x: pos.x + p.x, y: pos.y + p.y)
      newPos = self.boundLayerPos(newPos)
      selectedNode.removeAllActions()
 
      let moveTo = SKAction.moveTo(newPos, duration: scrollDuration)
      moveTo.timingMode = .EaseOut
      selectedNode.runAction(moveTo)
    }
  }
}

This callback gets called when the pan gesture begins, changes (i.e. the user continues to drag) and ends. The method checks the recognizer's state and performs the appropriate action for each case.

When the gesture begins, it converts the coordinates to node coordinates (note it has to do it the long way because there’s no shortcut method), and calls the selectNodeForTouch(_:) helper you wrote earlier to select a node.

When the gesture changes, it needs to figure out the amount the gesture moved. One of the nice things about gesture recognizers is that they store the cumulative translation for the gesture so far! However, you have to negate the y coordinate to account for the difference between UIKit coordinates and Sprite Kit coordinates.

After panning for the translation, it resets the translation on the recognizer to zero, because otherwise the translation is cumulative, and you just want the difference each time.

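To see why that reset matters, here's a tiny plain-Swift simulation of three pan callbacks with made-up translation values:

```swift
// Three pan callbacks; the recognizer reports *cumulative* translation
// since the gesture began (hypothetical x values).
let reported = [10.0, 25.0, 40.0]

// Applying each report directly would over-move the node:
let withoutReset = reported.reduce(0, +)   // 75, but the finger only travelled 40

// Resetting the translation to zero after each callback is equivalent
// to applying successive differences:
var last = 0.0
var moved = 0.0
for r in reported {
    moved += r - last   // the delta since the previous callback
    last = r            // what setTranslation(CGPointZero, ...) achieves
}
// moved == 40: exactly how far the finger travelled
```
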
When the gesture ends, there's some new and interesting code! Another cool thing a UIPanGestureRecognizer gives you is the velocity of the pan movement. You can use this to let the background keep sliding briefly after a quick flick, like you're used to seeing in scroll views. Based on the velocity, you run a move action with ease-out timing to make the deceleration feel natural.

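The momentum math is just distance = velocity × duration, clamped to the layer's bounds. Here's a small plain-Swift sketch with hypothetical numbers:

```swift
// Momentum scrolling: carry on past the lift-off point by
// velocity (points/second) * duration (seconds), then clamp.
func flickTarget(position: Double, velocity: Double, duration: Double,
                 minX: Double, maxX: Double) -> Double {
    let target = position + velocity * duration
    return min(max(target, minX), maxX)
}

// A 600 pt/s leftward flick over 0.2 s carries the layer 120 points further:
flickTarget(position: -200, velocity: -600, duration: 0.2, minX: -1024, maxX: 0)   // -320
// A very hard flick is clamped at the layer's edge:
flickTarget(position: -200, velocity: -9000, duration: 0.2, minX: -1024, maxX: 0)  // -1024
```
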
Build and run the project, and you should now be able to slide and move around, all with gesture recognizers!

Moving sprites with UIGestureRecognizers with Cocos2D

Remember to try the velocity-powered movement for the background – try a small flick-like swipe from left to right on the background and you’ll see it continue to scroll a bit after you lift your finger.

Where To Go From Here?

You can download the final project with all of the code from this tutorial.

At this point you should know how to move nodes using touches in your Sprite Kit apps and games, and should know the basics of using gesture recognizers with Sprite Kit.

From here, you could try extending this project with other gesture recognizers, such as perhaps pinch or rotate gesture recognizers. Maybe you can make the cat grow!

If you want to learn more about Sprite Kit, you should check out our book iOS Games by Tutorials. We’ll teach you everything you need to know – from physics, to tile maps, to particle systems, and even making your own level editor.

If you have any questions or comments about this tutorial, please join the discussion below!

The post Sprite Kit Tutorial: Drag and Drop Sprites appeared first on Ray Wenderlich.

Video Tutorial: What’s New in watchOS 2: Series Introduction

Video Tutorial: What’s New in watchOS 2 Part 1: Pickers


Sprite Kit Tutorial: Create an Interactive Children’s Book with Sprite Kit and Swift

Learn how to create an interactive children’s book for the iPad!

Update note: Jorge Jordán has updated this children’s book tutorial to iOS 8 and Swift! The original post was by tutorial team member Tammy Coron.

With the iPad, it’s never been a better time to be a kid!

The iPad allows developers to create beautiful interactive children’s books that simply cannot be replicated in any other medium. For some examples, check out The Monster at the End of This Book, Bobo Explores Light, and Wild Fables.

In the old days, developers had to use third party frameworks like Cocos2d to get the job done. But now, Apple has provided their own 2D framework called Sprite Kit, which is perfect for creating these types of books.

In this tutorial, you’ll create an interactive children’s book called The Seasons using the Sprite Kit framework, where you’ll learn how to add objects to scenes, create animation sequences, allow the reader to interact with the book and even how to add sound and music to your book!

Note: The Seasons was written and illustrated by Tutorial Team member Tammy L Coron. This tutorial uses the first few pages from that book. Please do not reproduce any of its contents in your own projects. The narration is provided by Amy Tominac, and again cannot be used in your own projects.

This tutorial also uses music from Kevin MacLeod and sound effects from FreeSound. These are both great resources for your project, but you must provide attribution. See attribution.txt in the project bundle for more details.

This tutorial assumes you are familiar with at least the basics of Sprite Kit. If you are new to Sprite Kit, please check out our Sprite Kit Tutorial for Beginners first.

Getting Started

First download the starter project for this tutorial. Extract the project to a convenient location on your drive, then open it up in Xcode.

Like any good book, it’s best to start at the beginning – the title scene. The starter project provides stub versions of the methods used in this scene — it’s your job to add the code. Your first steps will be to initialize the scene and add a background image.

Open Scene00.swift and add the following block of code to didMoveToView(_:) just after the comment that reads /* add setup here */:

var background = SKSpriteNode(imageNamed: "bg_title_page")
background.anchorPoint = CGPoint(x: 0, y: 0)
background.position = .zeroPoint
 
addChild(background)

The code above creates an SKSpriteNode with the title page’s background. Both the anchor point and position are set to (0,0), which means the lower-left corner of the background will be at the lower-left corner of the screen. Finally, it calls addChild(_:) to add the newly created node to the scene.

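If you're curious about the math behind anchor points: a sprite's frame origin on each axis is its position minus anchorPoint times size. A quick plain-Swift illustration (one axis, hypothetical sizes):

```swift
// For a given anchor point, a sprite's frame origin on each axis is:
//   frameOrigin = position - anchor * size
func frameOriginX(position: Double, anchor: Double, size: Double) -> Double {
    return position - anchor * size
}

// A 200-point-wide sprite positioned at x = 0:
frameOriginX(position: 0, anchor: 0.0, size: 200)  // 0: left edge sits at the position
frameOriginX(position: 0, anchor: 0.5, size: 200)  // -100: centered on the position (the default)
```

With the anchor at (0, 0) as above, the sprite's lower-left corner lands exactly at its position.
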
Still working in the same file, add the following two lines of code to didMoveToView(_:), immediately after the call to addChild(_:):

setUpBookTitle()
setUpFooter()

The two methods called here are only shells at the moment; you’ll flesh them out in the next section.

Build and run your project; you should see the title page as illustrated below:

tc_spritekit_build1

Adding the Book Title

As you saw in the code above, it’s a straightforward operation to add an SKSpriteNode to your scene. With just a few lines of code, you can add a textured sprite to any scene.

Now that the background image has been added, you’ll need to add the book title. You could use an SKLabelNode to add your title, or even a UIView, but instead I’ve opted to use a graphic.

Still working in the Scene00.swift file, add the following block of code to setUpBookTitle():

var bookTitle = SKSpriteNode(imageNamed: "title_text")
bookTitle.name = "bookTitle"
 
bookTitle.position = CGPoint(x: 421, y: 595)
addChild(bookTitle)

You first create an SKSpriteNode with the title text graphic. You also give the node a name “bookTitle” so you can access it later. You’re setting the position to an exact co-ordinate so it lines up precisely where it should against the background image.

Build and run your project; you’ll see your background image with the graphic title superimposed over it, as shown below:

tc_spritekit_build2

That looks pretty neat. But wouldn’t it be great if you could add a little bit of action to your title screen?

Adding Animation to Objects

Sprite Kit makes it easy to add animation to the objects in your scene. Instead of simply having the title appear on the screen, you can make it slide on to the screen and bounce a little bit before it comes to rest.

To do this, you’ll use SKAction objects with predefined sequences.

Head back to Scene00.swift and add the following code to the end of setUpBookTitle():

var actionMoveDown = SKAction.moveToY(600, duration: 3.0)
var actionMoveUp = SKAction.moveToY(603, duration: 0.25)
var actionMoveDownFast = SKAction.moveToY(600, duration: 0.25)
 
bookTitle.runAction(SKAction.sequence([actionMoveDown, actionMoveUp, actionMoveDownFast]))

The above code creates three actions using the SKAction class method moveToY(_:duration:); this specifies an action that moves a node vertically from its original position to the specified y position. Next, it creates a sequence action using the SKAction class method sequence(_:); this instructs bookTitle to run the specified array of actions in order.

The net result of these actions is that the sprite moves down, then up, and then down again before coming to rest.

Next, modify the line in setUpBookTitle() that sets bookTitle‘s position property as shown below:

bookTitle.position = CGPoint(x: 425, y: 900)

This simply modifies the start position of the title image so that it starts off-screen.

Build and run your project; the title now slides in slowly and bounces a little before it comes to rest at the position specified in actionMoveDownFast.

Actions are great, but sounds are an integral part of any interactive children’s book. Sprite Kit makes this tremendously easy as well!

Adding Sound to Your Story

There are several methods to add music and other sounds to your book, but in this tutorial you’ll just focus on two methods: one for adding the background music, and the other for adding the narration and sound effects.

Open SeasonsSceneBase.swift and add the following import statement:

import AVFoundation

AVFoundation is the iOS framework that will provide audio-visual capabilities to the app, and will be used here to play the story’s music and sound effects.

Add the following properties to the top of the SeasonsSceneBase class declaration:

private var backgroundMusicPlayer: AVAudioPlayer?
private var btnSound = SKSpriteNode(imageNamed: "button_sound_on")
private var soundOff: Bool

Next, add the following line of code before the call to super.init() in init(size:):

soundOff = NSUserDefaults.standardUserDefaults().boolForKey("pref_sound")

This code retrieves the Bool value that represents the user’s preference for having the sound on or off and assigns it to the soundOff variable.

Next, add the following code in Scene00.swift at the top of didMoveToView(_:):

playBackgroundMusic("title_bgMusic.mp3")

This will call into playBackgroundMusic(_:) to play the title song on this scene.

Now add the following block of code to playBackgroundMusic(_:):

var error: NSError?
let backgroundMusicURL = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
backgroundMusicPlayer = AVAudioPlayer(contentsOfURL: backgroundMusicURL, error: &error)
backgroundMusicPlayer?.numberOfLoops = -1
backgroundMusicPlayer?.volume = 1.0
backgroundMusicPlayer?.prepareToPlay()

playBackgroundMusic(_:) plays the background music using an AVAudioPlayer object.

Note: AVAudioPlayer isn’t specific to Sprite Kit, so it won’t be covered in detail here. For more detail on AVAudioPlayer, check out our Audio Tutorial for iOS.

Okay — you have some code to handle the user preference for enabling or disabling sound in your book. You should probably add a control to let the user set this preference! :]

Still working in SeasonsSceneBase.swift, add the code below to setUpFooter():

btnSound.zPosition = 1
 
if soundOff {
  let action = SKAction.setTexture(SKTexture(imageNamed: "button_sound_off"))
  btnSound.runAction(action)
 
  backgroundMusicPlayer?.stop()
} else {
  let action = SKAction.setTexture(SKTexture(imageNamed: "button_sound_on"))
  btnSound.runAction(action)
 
  backgroundMusicPlayer?.play()
}
 
if homeFooter {
  // Positions the sound button all the way to the right.
  btnSound.position = CGPoint(x: 980, y: 38)
} else {
  btnSound.position = CGPoint(x: self.size.width/2 + 330, y: 38)
}
addChild(btnSound)

The above code sets the btnSound sprite's zPosition so the button sits above the background. It then checks soundOff and, with an SKAction, sets the sprite's texture to the appropriate on or off image, playing or stopping the background music to match. Finally, it sets the button's position (all the way to the right for the home screen footer) and adds it to the scene.

There are a number of ways to handle changing a sprite’s image. The above method is only one way to accomplish this. The next section shows you another way to accomplish the same thing.

For now, build and run the app, and the background music should play. Tapping the button won’t do anything yet though.

Detecting Touch Events

The reason you can’t currently toggle the control is that the sprite doesn’t have any touch event handlers. Add the following code to touchesBegan(_:withEvent:) in SeasonsSceneBase.swift:

for touch in touches as! Set<UITouch> {
  var location: CGPoint = touch.locationInNode(self)
 
  if btnSound.containsPoint(location) {
     showSoundButtonForTogglePosition(soundOff)
  }
}

touchesBegan(_:withEvent:) is part of the UIResponder class which all SKNode objects extend. It tells the receiver when one or more fingers begin touching the view. In this method you’re able to capture the location of the touch and respond to it accordingly.

Note: As every SKNode extends UIResponder, you can implement touch handling directly in your sprites. This tutorial performs all touch event handling in the scene nodes simply to keep the number of classes more manageable. However, in a production app it would likely make more sense to create an SKSpriteNode subclass, such as SoundToggleButton, and have that subclass handle its own touch events.

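If you do go the subclass route suggested in the note, the button can own its on/off state. Here's a hypothetical, UIKit-free sketch of that state machine; the texture names match the ones used in this tutorial, but the SoundToggle type itself is an invention for illustration:

```swift
// A hypothetical model for a SoundToggleButton subclass: the button
// owns its on/off state and reports which texture to show and whether
// music should be playing.
struct SoundToggle {
    private(set) var soundOff: Bool

    var textureName: String {
        return soundOff ? "button_sound_off" : "button_sound_on"
    }
    var shouldPlayMusic: Bool { return !soundOff }

    mutating func toggle() {
        soundOff.toggle()
        // A real node would also persist the preference here and swap
        // its texture with an SKAction.
    }
}

var button = SoundToggle(soundOff: false)
button.toggle()
button.textureName      // "button_sound_off"
button.shouldPlayMusic  // false
```
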
Look at the if statement above; you’ll see that if the touch location is on the sound button, you call showSoundButtonForTogglePosition(_:).

Now would be a great time to implement that method! :] Add the following code to showSoundButtonForTogglePosition(_:) in SeasonsSceneBase.swift:

if togglePosition {
  let action = SKAction.setTexture(SKTexture(imageNamed: "button_sound_on"))
  btnSound.runAction(action)
 
  soundOff = false
  NSUserDefaults.standardUserDefaults().setBool(false, forKey: "pref_sound")
  NSUserDefaults.standardUserDefaults().synchronize()
 
  backgroundMusicPlayer?.play()
} else {
  let action = SKAction.setTexture(SKTexture(imageNamed: "button_sound_off"))
  btnSound.runAction(action)
 
  soundOff = true
  NSUserDefaults.standardUserDefaults().setBool(true, forKey: "pref_sound")
  NSUserDefaults.standardUserDefaults().synchronize()
 
  backgroundMusicPlayer?.stop()
}

The method above sets the appropriate texture for the SKSpriteNode held in btnSound, much as you did in setUpFooter(). Once the texture has been set, you store the user's selection and either play or stop the music accordingly.

Note: An SKTexture object stores a reference to the graphical data in memory that a sprite renders.

Build and run, and now you should be able to tap the sound button to toggle the background music on and off.

Adding the Next Scene

Going back to Scene00.swift, add the following code to setUpBookTitle() just after the line that creates actionMoveDownFast, but before the line that calls runAction: on bookTitle:

var wait = SKAction.waitForDuration(0.75)
 
var showButton = SKAction.runBlock {
 
  var buttonStart = SKSpriteNode(imageNamed: "button_read")
  buttonStart.name = "buttonStart"
 
  buttonStart.position = CGPoint(x: 425,y: 460)
  buttonStart.zPosition = 1
 
  self.addChild(buttonStart)
 
  buttonStart.runAction(SKAction.playSoundFileNamed("thompsonman_pop.wav", waitForCompletion: false))
}

The code above creates a new action block which creates a sprite for the start button and plays a sound. By putting the code to create the sprite and play a sound inside an action, you can easily make it run at a particular point in a sequence of actions.

As you may have figured out by now, there is usually more than one way to skin a cat — or code your children’s book!

Replace the line in setUpBookTitle() that calls runAction(_:) on bookTitle with the following code:

bookTitle.runAction(SKAction.sequence([actionMoveDown, actionMoveUp, actionMoveDownFast, wait, showButton]))

This simply adds the new actions to the sequence that bookTitle runs when called.

The next step is to modify the touch handling. Add the following code to the top of touchesBegan(_:withEvent:):

// Make sure the start button has been added.
if let startButton = childNodeWithName("buttonStart") {
 
  /* Called when a touch begins */
  for touch in touches {
    var location = touch.locationInNode(self)
 
    if startButton.containsPoint(location) {
      goToScene(Scene01(size: size))
    }
  }
}

This looks up the button (added earlier in the showButton action), assigns it to the startButton constant, and goes to Scene01 if the touch is located within the button.

Next, add the following code to goToScene(_:) in SeasonsSceneBase.swift:

backgroundMusicPlayer?.stop()
 
var sceneTransition = SKTransition.fadeWithColor(UIColor.darkGrayColor(), duration: 1)
self.view?.presentScene(scene, transition: sceneTransition)

The above code stops the background music, then presents the new scene with a fade transition.

Build and run your project. After the initial animation, you should see the new “Read Story” button, as shown below:

tc_spritekit_build3

Tap the “Read Story” button and the next scene presents itself on-screen. You won’t see a whole lot at this point — your next task is to add some content to the first page.

Adding Content to the First Page

In this section, you’ll add the first page to the book along with some physics simulation and some touch response logic.

For this purpose you will use all of the goodies in SKTUtils, which is an external resource developed for iOS Games by Tutorials and won’t be covered in detail in this tutorial.

Add the following properties to the class in SeasonsSceneBase.swift:

var footer = SKSpriteNode(imageNamed: "footer")
var btnLeft = SKSpriteNode(imageNamed: "button_left")
var btnRight = SKSpriteNode(imageNamed: "button_right")
Note: From this point forward, all code dealing with background sound has been included in the starter files for you. You can review the previous section to see how sound is implemented in this project.

Just as before, you’ll need to initialize the scene before you can do anything with it. There are several ways to accomplish this, but this tutorial takes the approach of creating separate methods for each scene instead of handling everything within a single method.

Add the following block of code to the bottom of didMoveToView(_:) in Scene01.swift:

setUpText()
setUpFooter()
setUpMainScene()

Those are your basic setup methods — your job is to fill them out. The good news is that the implementations of setUpText() and setUpFooter() are identical for each scene in your project, so you'll only need to write them once for all scenes.

Add the following code to setUpText in Scene01.swift:

var text = SKSpriteNode(imageNamed: "pg01_text")
text.position = CGPoint(x: 300 , y: 530)
addChild(text)
 
readText()

Here you create an SKSpriteNode, set its location, and add it to the scene. The sprite above holds an image for the text of the page; you’ll have one sprite for the text of each page in your book.

This method also calls readText() which plays the narration for the book.

Add the following code to readText() in Scene01.swift:

if actionForKey("readText") == nil {
  var readPause = SKAction.waitForDuration(0.25)
  var readText = SKAction.playSoundFileNamed("pg01.wav", waitForCompletion: true)
 
  var readSequence = SKAction.sequence([readPause, readText])
 
  runAction(readSequence, withKey: "readText")
} else {
  removeActionForKey("readText")
}

Here you create a new SKAction, much like you did earlier for the bouncing title text. The difference this time — and it’s an important difference — is that you also assign a key name of readText to your action. You’ll discover why a bit later.

You’re also using SKAction’s playSoundFileNamed(_:waitForCompletion:) class method, which is a really simple method for playing sounds. While this method works great for quick sound effects, it’s probably not the best choice for read-along text because you can’t interrupt the sound when it’s playing, among other reasons. You’re using it in this tutorial for ease-of-use only and to become familiar with the framework.

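The guard around the readText key boils down to "only start the action if nothing is registered under that key". Here's a toy, plain-Swift registry that mimics that behavior; the ActionRunner type is a hypothetical stand-in for SKNode's action bookkeeping, not part of Sprite Kit:

```swift
// The readText guard boils down to: only start an action if nothing is
// already registered under its key.
struct ActionRunner {
    private var running = Set<String>()

    // Returns true if the action started, false if one with the same
    // key is already in flight (like the actionForKey(_:) nil check).
    @discardableResult
    mutating func run(key: String) -> Bool {
        if running.contains(key) { return false }
        running.insert(key)
        return true
    }

    mutating func finish(key: String) { running.remove(key) }
}

var runner = ActionRunner()
runner.run(key: "readText")   // true: narration starts
runner.run(key: "readText")   // false: already reading, so do nothing
runner.finish(key: "readText")
runner.run(key: "readText")   // true again once the narration completes
```
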
Build and run, and now you will see and hear the narrated text for page 1!

iOS Simulator Screen shot Jan 13, 2014, 2.42.55 PM

Adding the Navigation Controls

Next, you’ll add the navigation controls for your app, which will all be located along the bottom of the screen.

Add the following code above your btnSound setup code to setUpFooter() in SeasonsSceneBase.swift:

if !homeFooter {
  /* add the footer */
  footer.position = CGPoint(x: self.size.width/2, y: 38)
  btnRight.zPosition = 1
  addChild(footer)
}
 
/* add the right button if there is a next scene */
if getNextScene() != nil {
  btnRight.position = CGPoint(x: self.size.width/2 + 470, y: 38)
  btnRight.zPosition = 1
  addChild(btnRight)
}
 
/* add the left button if there is a previous scene */
if getPreviousScene() != nil {
  btnLeft.position = CGPoint(x: self.size.width/2 + 400, y: 38)
  btnLeft.zPosition = 1
  addChild(btnLeft)
}

The code above initializes footer with the footer area's background image, sets its position and adds it to the scene. It also adds sprites for the page back and page forward buttons, depending on whether there is a “previous” or “next” scene.

Build and run, and you will now see the footer along the bottom of the scene (although tapping it won’t do anything yet):

Book footer

Now that the basic scene setup is complete, it’s time to add the main character.

Adding The Main Character

Still working in the Scene01.swift file, add the following property to Scene01:

private var kid = SKSpriteNode(imageNamed: "pg01_kid")

This variable holds a reference to your main character’s sprite.

Next, add the following two lines to setUpMainScene() in Scene01.swift:

setUpMainCharacter()
setUpHat()

These methods simply call other methods to keep your code nice and tidy. You’ll populate those next.

Add the following code to setUpMainCharacter in Scene01.swift:

kid = SKSpriteNode(imageNamed: "pg01_kid")
kid.anchorPoint = .zeroPoint
kid.position = CGPoint(x: self.size.width/2 - 245, y: 45)
kid.zPosition = 1
 
addChild(kid)
setUpMainCharacterAnimation()

This should be familiar territory to you by now; you create an SKSpriteNode, set its anchorPoint, position and zPosition, and add it to the scene. Then you call setUpMainCharacterAnimation to set up the main character animation.

setUpMainCharacterAnimation only exists as a shell — time to add some madness to your method! :]

Add the following code to setUpMainCharacterAnimation() in Scene01.swift:

var textures = [SKTexture]()
 
for i in 0..<3 {
  var textureName = "pg01_kid0\(i)"
  var texture = SKTexture(imageNamed: textureName)
 
  textures.append(texture)
}
 
var blink = SKAction.animateWithTextures(textures, timePerFrame: 0.25)
var wait = SKAction.waitForDuration(4.5, withRange: 1.5)
 
var mainCharacterAnimation = SKAction.sequence([blink, wait, blink, blink, wait, blink, blink])
kid.runAction(SKAction.repeatActionForever(mainCharacterAnimation))

Animations are performed using a series of images. In the code above, the main character's blink animation cycles through three textures to achieve this effect.

The first line creates an array to hold your images. Following that, you generate the image name, create an SKTexture object for each image in the animation, then add that object to your array.

Next you create the animation sequence using the SKAction class method animateWithTextures(_:timePerFrame:), which expects an array of textures.

Finally, you instruct the kid sprite to perform its action. By using repeatActionForever(_:), you tell the node to run this action continuously.

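Ignoring the wait actions in the sequence, a repeating texture animation simply cycles through its frames. Which frame is on screen t seconds in comes down to this little calculation (plain Swift, for illustration only):

```swift
// For an animation with `frameCount` textures shown `timePerFrame`
// seconds each, repeated forever, the frame on screen t seconds in is:
func frameIndex(at t: Double, frameCount: Int, timePerFrame: Double) -> Int {
    return Int(t / timePerFrame) % frameCount
}

// Three blink textures at 0.25 s each, as in the code above:
frameIndex(at: 0.0,  frameCount: 3, timePerFrame: 0.25)  // 0
frameIndex(at: 0.30, frameCount: 3, timePerFrame: 0.25)  // 1
frameIndex(at: 0.80, frameCount: 3, timePerFrame: 0.25)  // 0: wrapped around
```
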
Build and run your project; hit the “Read Story” button and you’ll see the main character appear on-screen and blink his eyes while the narration plays in the background, as so:

tc_spritekit_build4

An Introduction to Physics

You can really improve the appeal of your book by adding some interactivity. In this section, you’re going to create a hat for the main character which the reader can drag around the screen and place on the main character’s head.

Still working in Scene01.swift, add the following variable to the Scene01:

private var hat = SKSpriteNode(imageNamed: "pg01_kid_hat")

Add the following code to setUpHat() in Scene01.swift:

var label = SKLabelNode(fontNamed: "Thonburi-Bold")
label.text = "Help Mikey put on his hat!"
label.fontSize = 20.0
label.fontColor = UIColor.yellowColor()
label.position = CGPoint(x: 160, y: 180)
 
addChild(label)
 
hat = SKSpriteNode(imageNamed: "pg01_kid_hat")
hat.position = CGPoint(x: 150, y: 290)
hat.physicsBody = SKPhysicsBody(rectangleOfSize: hat.size)
hat.physicsBody?.restitution = 0.5
hat.zPosition = 2
 
addChild(hat)

The first half of the code above creates an SKLabelNode object and adds it to the scene. An SKLabelNode is essentially the Sprite Kit version of UILabel: a node that draws a string.

The second half of the code adds the physics. You create the SKSpriteNode assigned to the hat variable and give it an SKPhysicsBody, which lets you apply a number of physical characteristics to your objects, such as shape, size, mass, gravity and friction.

Here the physics body is a rectangle matching the sprite's size. You also set the zPosition, and set restitution to 0.5, which means the body bounces back with half the force with which it hits other objects.

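As a rough intuition for what restitution 0.5 means: rebound speed is the restitution times impact speed, so each bounce reaches about restitution squared of the previous height (ignoring air resistance and the details of Sprite Kit's solver). A quick sketch:

```swift
// Restitution e scales rebound speed (v' = e * v). Because drop height
// varies with v squared, each bounce reaches e * e of the previous height.
func bounceHeights(initial: Double, restitution e: Double, bounces: Int) -> [Double] {
    var h = initial
    var heights = [Double]()
    for _ in 0..<bounces {
        h *= e * e
        heights.append(h)
    }
    return heights
}

// Dropped from 100 points with restitution 0.5, like the hat:
bounceHeights(initial: 100, restitution: 0.5, bounces: 3)  // [25.0, 6.25, 1.5625]
```
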
Build and run your project; tap the “Read Story” button and…huh? Where did the hat go?

If you were watching the screen closely, you may have noticed the hat falling off the screen. That’s because there was no opposing body to stop it from falling.

You can fix that by adding a physics body to act as the ground.

Add the following code to setUpFooter() in SeasonsSceneBase.swift, just after the line that calls addChild(_:) to add footer to the scene:

physicsBody = SKPhysicsBody(edgeLoopFromRect: footer.frame)

Build and run your project again; this time, the hat has something to land on: the physics body you set up in the footer. You can also see the yellow text label that you created with SKLabelNode, as shown below:

tc_spritekit_build5

Okay, you’ve added some physics properties to the hat — but how do you go about adding interactivity?

Handling Touches and Moving the Hat

This section implements the touch handling for the hat so that you can move it around the screen, as well as touch handling for the Next, Previous and sound preference button.

Add the following block of code immediately after the existing if statement of touchesBegan(_:withEvent:) in SeasonsSceneBase.swift:

else if actionForKey("readText") != nil { // do not turn page if reading
  return
} else if btnRight.containsPoint(location) {
  // Right button goes to the next scene.
  if let nextScene = getNextScene() {
    goToScene(nextScene)
  }
} else if btnLeft.containsPoint(location) {
  // Left button goes to the previous scene
  if let previousScene = getPreviousScene() {
    goToScene(previousScene)
  }
}

Here you check to see if the Next or Previous page buttons were the target of the touch event, much like you did for the sound toggle button. You then handle the touch event by moving to the appropriate scene.

Additionally, there is a check for actionForKey(_:). Recall the key that you set for the action that narrates the text?

runAction(readSequence, withKey:"readText")

The block of code you added above uses that key to check if the readText action is currently playing. If it is, do nothing. If it’s not, turn the page.

But why check at all?

The reason is that when you start an SKAction that plays a sound, it’s impossible to interrupt that sound. This is why you can’t turn the page while the text is being read. As was mentioned earlier, you’ll probably want to use something more robust to narrate the text in your production-level app.

It would be nice to give the reader a way to jump back to the first scene, no matter where they are in the book.

Add the following code right below the code that you added previously, which will take you back to the first scene when the user touches the book title in the footer:

else if location.x >= 29 && location.x <= 285 && location.y >= 6 && location.y <= 68 {
  // Go back to the home scene.
  goToScene(Scene00(size: size))
}

The above code tests the touch location a little differently than the other checks you’ve made. Because the book’s title is part of the footer image, this simply checks to see whether or not the touch location falls within the area where you know the book title appears. It’s usually not a good idea to have “magic numbers” like this strewn about your app, but it serves to keep things simple in this tutorial.

The rest of the above code is exactly like the code that handles the Previous Page button touch events.
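
If you'd like something more self-documenting than the magic-number check for the book title above, you could wrap the same bounds in a named rectangle. Here's a minimal, SpriteKit-free sketch of that idea (in the app itself you'd simply use a CGRect; the Rect type and names here are made up for illustration):

```swift
// A minimal stand-in for CGRect, so this sketch runs anywhere.
struct Rect {
    let x, y, width, height: Double

    func contains(x px: Double, y py: Double) -> Bool {
        return px >= x && px <= x + width && py >= y && py <= y + height
    }
}

// The same bounds as the magic numbers above: x 29...285, y 6...68.
let bookTitleArea = Rect(x: 29, y: 6, width: 256, height: 62)

let insideTitle = bookTitleArea.contains(x: 100, y: 30)    // true: touch on the title
let outsideTitle = bookTitleArea.contains(x: 400, y: 30)   // false: touch elsewhere
```

Naming the area makes the intent of the hit test obvious at the call site, at the cost of a few extra lines.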

Finally, you’ll need to handle touch events on the hat. To do so, you’ll need to store some data between events.

Add the following two variables to the variables of Scene01.swift:

private var touchingHat = false
private var touchPoint: CGPoint?

In the above code, touchingHat stores whether the user is currently touching the hat, while touchPoint stores the most recent touch location.

To ensure the hat catches any touch events that occur both over it and another target area, check the hat first.

To do this, add the following code to touchesBegan(_:withEvent:) in Scene01.swift:

for touch in touches as! Set<UITouch> {
  var location = touch.locationInNode(self)
 
  if hat.containsPoint(location) {
    touchingHat = true
    touchPoint = location
 
    /* change the physics or the hat is too 'heavy' */
 
    hat.physicsBody?.velocity = CGVectorMake(0, 0)
    hat.physicsBody?.angularVelocity = 0
    hat.physicsBody?.affectedByGravity = false
  }
}

When the user first touches the hat, the code above sets the touchingHat flag to true and stores the touch location in touchPoint. It also makes a few changes to the hat’s physics body. These changes are necessary because without them it’s virtually impossible to drag the hat around the screen as you’re constantly fighting with the physics engine.

You’ll need to track the touch as it moves across the screen, so add the following code to touchesMoved(_:withEvent:) in Scene01.swift:

if let touch = touches.first as? UITouch {
  touchPoint = touch.locationInNode(self)
}

Here you update the most recent touch location stored in touchPoint.

When the user stops touching the screen, you need to reset any hat-related data.

Add the following code to both touchesEnded(_:withEvent:) and touchesCancelled(_:withEvent:):

touchingHat = false
hat.physicsBody?.affectedByGravity = true

The above code sets the touchingHat flag to false and re-enables gravity for the hat so that it will fall back to the floor when the user releases it.

There’s just one more thing to do to get the hat to track the user’s finger as it moves on the screen.

Add the following code to update(_:):

if touchingHat {
  // touchPoint is an optional, so unwrap it into a mutable copy before clamping
  if var point = touchPoint {
    point.x.clamp(hat.size.width / 2, self.size.width - hat.size.width / 2)
    point.y.clamp(footer.size.height + hat.size.height / 2, self.size.height - hat.size.height / 2)
 
    hat.position = point
  }
}

update(_:) is invoked before each frame of the animation is rendered. Here you check whether the user is dragging the hat; if so, you move the hat to the position stored in touchPoint. You’re using the clamp function from SKTUtils to ensure the hat doesn’t move off the screen or below the footer.
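
If you're curious what clamp does under the hood, here's a minimal, SpriteKit-free sketch of the same idea (SKTUtils' real version is a mutating extension on CGFloat; the function name and the numeric values here are illustrative only):

```swift
// Constrain a value to the closed range [lower, upper].
func clamped(_ value: Double, lower: Double, upper: Double) -> Double {
    return max(lower, min(upper, value))
}

// Keep the hat's x on screen: half the hat's width in from each edge.
let hatHalfWidth = 45.0
let screenWidth = 1024.0
let x = clamped(1100.0, lower: hatHalfWidth, upper: screenWidth - hatHalfWidth)
// x == 979.0 -- the hat is pinned just inside the right edge
```

Values outside the range snap to the nearest bound, while values inside pass through unchanged.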

Build and run your project; hit the “Read Story” button, and play around with the hat on the screen a little.

Note: Try building the app both with and without the changes to the hat’s physics bodies to see the difference it makes. You should be able to move the hat around with your finger easily when the changes to the physics body are applied, but it’s a little more difficult to move it around when you comment out the changes.

Moving the hat around is cool, but there isn’t any feedback as to whether or not the hat is on top of the main character’s head. It’s time to add that feedback.

Modify touchesEnded(_:withEvent:) so that it looks like the code below:

if let touch = touches.first as? UITouch where touchingHat {
  var currentPoint: CGPoint! = touch.locationInNode(self)
 
  if currentPoint.x >= 300 && currentPoint.x <= 550 && currentPoint.y >= 250 && currentPoint.y <= 400 {
    currentPoint.x = 420
    currentPoint.y = 330
 
    hat.position = currentPoint
 
    let popSound = SKAction.playSoundFileNamed("thompsonman_pop.wav", waitForCompletion: false)
    hat.runAction(popSound)
  } else {
    hat.physicsBody?.affectedByGravity = true
  }
  touchingHat = false
}

With the above bit of code you can determine if the user is touching the hat and where they attempted to release it. This is why you want to use the end event and not the begin event.

If the user releases the hat close enough to the kid’s head, your code re-positions the hat to an exact location as defined by currentPoint.x and currentPoint.y. You also play a sound to alert the user that the hat is now firmly placed on the main character’s head – which is important! Did you see all that snow outside the window? Brrrrr!

Build and run your project; grab the hat and plunk it down on your character’s head, like so:

tc_spritekit_build7

Aside from the story and the narration, these interactive elements with actions and sounds are key to the experience and really take advantage of what iOS and Sprite Kit have to offer.

Where To Go From Here?

I hope you enjoyed working through this tutorial and that it was a little different from the usual use of Sprite Kit in games. You can download the complete sample project and compare notes if you’d like.

At this point, the rest of the story is up to you! If you check out the completed project download, you’ll find additional scenes for pages 2–4 as a bonus.

tc_spritekit_build8

The code there is similar to what you’ve already built, and you should be able to look around and figure out what’s going on in those scenes.

If you’re looking to learn more about Sprite Kit, be sure to check out our book, iOS Games by Tutorials.

If you have any questions or comments, feel free to join in the discussion below!

The post Sprite Kit Tutorial: Create an Interactive Children’s Book with Sprite Kit and Swift appeared first on Ray Wenderlich.

How To Implement A* Pathfinding with Swift

Add the A* Pathfinding Algorithm to this simple Sprite Kit game!


Update note: This tutorial was update to Swift and Sprite Kit by Gabriel Hauber. The original Cocos2D tutorial was written by Johann Fradj.

In this tutorial, you’ll learn how to add the A* Pathfinding algorithm into a simple Sprite Kit game. A* lets you calculate a navigable path between two points, and is especially useful for 2D tile-based games such as CatMaze, which you’ll work on in this tutorial.

If you’re not familiar with the A* algorithm itself, you should read Introduction to A* Pathfinding first for all the details on how it works. This tutorial will cover the implementation of the algorithm into an existing Sprite Kit game.

To go through this tutorial, it’s helpful if you have prior knowledge of the Sprite Kit framework on iOS or OS X. If you need a refresher there, check out our Sprite Kit Tutorials to take you all the way from core concepts to complete games!

Find the shortest path to your keyboard, and begin! :]

Getting Started

Start by downloading the starter project. It’s a simple Sprite Kit based game configured with both iOS and OS X targets. Build and run the project for your platform of choice. If you run the OS X version, you should see the following:

Cat Maze: A simple tile-based Sprite Kit game for iOS and OS X

Cat Maze: A simple tile-based Sprite Kit game for iOS and OS X

In this game, you take the role of a cat thief trying to make your way through a dungeon guarded by dangerous dogs. If you try to walk through a dog they will eat you – unless you can bribe them with a bone! You need to traverse the dungeon in the correct order so you have enough bones when you need them to get through a dog blocking your way as you make your way to the exit.

Note that the cat can only move vertically or horizontally (not diagonally), and will move from one tile center to another. Each tile can be either walkable or unwalkable.

Try out the game, and see if you can reach the exit! When you tap or click somewhere on the map, the cat will jump to an adjacent tile in the direction of your tap. (On the OS X version you can also use the cursor keys to move around).

Cat Maze and A* Overview

In this tutorial you will modify this so that the cat will automatically move all the way to the tile you tapped using the most efficient route, much like many RPGs or point-and-click adventure games.

Open Cat.swift and take a look at the implementation of moveToward(_:). You will see that it calls moveInDirection(_:) with a direction based on the cat’s current position and the target position.

You will change this so that instead of moving one tile at a time, the cat will find the shortest path to the target, and make its way along that path.

First, replace the moveToward(_:) implementation with the following:

func moveToward(target: CGPoint) {
  let toTileCoord = gameScene.tileMap.tileCoordForPosition(target)
  moveTo(toTileCoord)
}

Build and run the game now and make a move; the cat will teleport to the selected location. Well, that’s the end result you’re looking for so congratulations! ;]

The actual move algorithm is in moveTo(_:), which is where you’ll calculate the shortest path using the A* algorithm and make the cat follow it.

A Reusable Pathfinder

The A* pathfinding algorithm is the same whether it is finding the best path for a cat in a maze full of dogs, or for a warrior in a dungeon full of monsters. Because of this, the pathfinder you create in this tutorial will be reusable in other projects.

The pathfinding algorithm is going to need to know the following pieces of information from the game:

  • Given a map location, what map locations next to it are valid locations to move to?
  • What is the movement cost between two map locations?

In the project navigator, right-click on the Shared group, and select New File…. Choose the iOS\Source\Swift File template and click Next. Name the file AStarPathfinder and select the Shared directory as the location in which to create the file. Ensure both CatMaze and CatMaze Mac targets are selected and click Create.

Add the following code to the file you just created:

protocol PathfinderDataSource: NSObjectProtocol {
  func walkableAdjacentTilesCoordsForTileCoord(tileCoord: TileCoord) -> [TileCoord]
  func costToMoveFromTileCoord(fromTileCoord: TileCoord, toAdjacentTileCoord toTileCoord: TileCoord) -> Int
}
 
/** A pathfinder based on the A* algorithm to find the shortest path between two locations */
class AStarPathfinder {
  weak var dataSource: PathfinderDataSource?
 
  func shortestPathFromTileCoord(fromTileCoord: TileCoord, toTileCoord: TileCoord) -> [TileCoord]? {
    // placeholder: move immediately to the destination coordinate
    return [toTileCoord]
  }
}

The PathfinderDataSource protocol describes the two requirements listed above: available (walkable) adjacent tiles, and the movement cost. This will be used by the AStarPathfinder, which you’ll fill out with the algorithm later.

Setting up the Game to Use the Pathfinder

In order to use the pathfinder, the Cat object will need to create an instance of it. But what is a good candidate for the pathfinder’s data source?

You have two choices:

  1. The GameScene class. It knows about the map, and so is a candidate for supplying information such as movement costs and what tiles are walkable, and so on.
  2. The Cat class, but why would you choose this? Imagine a game which has multiple moveable character types, each of which has its own rules as to what is a walkable tile and movement costs. For example, a Ghost character may be able to move through walls, but it costs more to do so.

Because you are such a fan of good design, you choose the second option :]

Open Cat.swift and add the following class extension to the bottom of the file so it conforms to PathfinderDataSource:

extension Cat: PathfinderDataSource {
  func walkableAdjacentTilesCoordsForTileCoord(tileCoord: TileCoord) -> [TileCoord] {
    let adjacentTiles = [tileCoord.top, tileCoord.left, tileCoord.bottom, tileCoord.right]
    return adjacentTiles.filter { self.gameScene.isWalkableTileForTileCoord($0) }
  }
 
  func costToMoveFromTileCoord(fromTileCoord: TileCoord, toAdjacentTileCoord toTileCoord: TileCoord) -> Int {
    return 1
  }
}

As you can see, finding the adjacent tiles is really simple: create an array with the tile coordinates around the given tile coordinate, then filter the array to return only walkable tiles.

Because you can’t move diagonally and because terrain is just walkable or unwalkable the cost is always the same. In your other apps, perhaps diagonal movement costs more or there are different terrain types such as swamps, hills, etc.
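
For example, a terrain-aware cost function might look something like this sketch — the terrain types and cost values here are invented for illustration, not part of Cat Maze:

```swift
// Hypothetical terrain types with different movement costs.
enum Terrain {
    case road, grass, swamp

    var moveCost: Int {
        switch self {
        case .road:  return 1   // cheap: the pathfinder will prefer roads
        case .grass: return 2
        case .swamp: return 5   // expensive: avoided when a cheaper route exists
        }
    }
}

// In a real data source you'd look the destination tile's terrain up in the tile map
// inside costToMoveFromTileCoord(_:toAdjacentTileCoord:).
func costToMove(onto terrain: Terrain) -> Int {
    return terrain.moveCost
}
```

Because A* always expands the cheapest known route first, raising a tile's cost is all it takes to make characters route around that tile when a cheaper path exists.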

Now that you’ve implemented the pathfinder’s data source, it’s time to create a pathfinder instance. Add the following properties to the Cat class:

let pathfinder = AStarPathfinder()
var shortestPath: [TileCoord]?

In the initializer, set up the cat as the pathfinder’s data source immediately after the call to super.init():

pathfinder.dataSource = self

In moveTo(_:), find the two lines of code that update the cat’s position and state:

position = gameScene.tileMap.positionForTileCoord(toTileCoord)
updateState()

Replace those two lines with the following:

shortestPath = pathfinder.shortestPathFromTileCoord(fromTileCoord, toTileCoord: toTileCoord)

Once you fill out the pathfinding algorithm, this property will store the list of steps needed to get from point A to point B.

Creating the ShortestPathStep Class

In order to calculate the shortest path using the A* algorithm, you need to know for each of the path’s steps:

  • Location
  • F, G and H scores
  • parent step (so you can trace back along its length from end to beginning)

You’ll capture all this information in a private class called ShortestPathStep.

Add the following code to the top of AStarPathfinder.swift:

/** A single step on the computed path; used by the A* pathfinding algorithm */
private class ShortestPathStep: Hashable {
  let position: TileCoord
  var parent: ShortestPathStep?
 
  var gScore = 0
  var hScore = 0
  var fScore: Int {
    return gScore + hScore
  }
 
  var hashValue: Int {
    return position.col.hashValue + position.row.hashValue
  }
 
  init(position: TileCoord) {
    self.position = position
  }
 
  func setParent(parent: ShortestPathStep, withMoveCost moveCost: Int) {
    // The G score is equal to the parent G score + the cost to move from the parent to it
    self.parent = parent
    self.gScore = parent.gScore + moveCost
  }
}
 
private func ==(lhs: ShortestPathStep, rhs: ShortestPathStep) -> Bool {
  return lhs.position == rhs.position
}
 
extension ShortestPathStep: Printable {
  var description: String {
    return "pos=\(position) g=\(gScore) h=\(hScore) f=\(fScore)"
  }
}

As you can see, this is a very simple class that keeps track of the following:

  • The step’s tile coordinate
  • The G score (the movement cost from the start to the step’s position)
  • The H score (the estimated number of tiles between the current position and destination)
  • The step before this step in the path (the parent)
  • The F score, that is, the score for this tile (calculated by adding G + H).

The class also conforms to Equatable, via the == function that follows it: two steps are equal if they refer to the same location, regardless of their G or H scores.

Finally, it is also Printable for the purposes of human-friendly debug messages.
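
Here's a small standalone illustration of that equality rule, using a simplified stand-in for ShortestPathStep (written in current Swift syntax; the Step type is made up for this demo):

```swift
// Simplified stand-in: only the position participates in equality and hashing.
struct Step: Hashable {
    let col: Int
    let row: Int
    var gScore = 0

    static func == (lhs: Step, rhs: Step) -> Bool {
        return lhs.col == rhs.col && lhs.row == rhs.row
    }

    func hash(into hasher: inout Hasher) {
        hasher.combine(col)
        hasher.combine(row)
    }
}

var cheap = Step(col: 3, row: 5)
cheap.gScore = 10
var costly = Step(col: 3, row: 5)
costly.gScore = 99

// Same tile, different scores: still "the same step". This is what lets the
// closed set and the open list recognize a tile they've already seen.
let same = cheap == costly   // true
```

Without position-only equality, re-visiting a tile with a different score would look like a brand-new step, and the closed list would never stop the search from looping.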

Implementing the A* Algorithm

Now the bootstrapping is over and it’s time to write the code to calculate the optimal path! First, add the following helper methods to the AStarPathfinder class:

private func insertStep(step: ShortestPathStep, inout inOpenSteps openSteps: [ShortestPathStep]) {
  openSteps.append(step)
  openSteps.sort { $0.fScore <= $1.fScore }
}
 
func hScoreFromCoord(fromCoord: TileCoord, toCoord: TileCoord) -> Int {
  return abs(toCoord.col - fromCoord.col) + abs(toCoord.row - fromCoord.row)
}

The first method insertStep(_:inOpenSteps:) inserts a ShortestPathStep into the open list at the appropriate position ordered by F score. Note that it modifies the array in-place and is passed in as an inout parameter.

The second method computes the H score for a square according to the Manhattan (or “city block”) method, which calculates the total number of steps moved horizontally and vertically to reach the final desired step from the current step, ignoring any obstacles that may be in the way.
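
As a quick sanity check, here's the Manhattan calculation on its own, using tile coordinates from this tutorial's maze:

```swift
// Manhattan distance: horizontal steps plus vertical steps, obstacles ignored.
func manhattanDistance(fromCol: Int, fromRow: Int, toCol: Int, toRow: Int) -> Int {
    return abs(toCol - fromCol) + abs(toRow - fromRow)
}

// From the cat's start tile (col 24, row 0) to a destination at (col 22, row 3):
// |22 - 24| + |3 - 0| = 2 + 3 = 5 tiles, even though walls may force a longer route.
let h = manhattanDistance(fromCol: 24, fromRow: 0, toCol: 22, toRow: 3)   // 5
```

Because it ignores obstacles, the Manhattan estimate never overshoots the true cost on a 4-directional grid, which is exactly the property A* needs from its H score.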

With these helper methods in place, you now have everything you need to implement the pathfinding algorithm itself.

Delete the current placeholder code in shortestPathFromTileCoord(_:toTileCoord:) and replace it with the following:

// 1
if self.dataSource == nil {
  return nil
}
let dataSource = self.dataSource!
 
// 2
var closedSteps = Set<ShortestPathStep>()
var openSteps = [ShortestPathStep(position: fromTileCoord)]
 
while !openSteps.isEmpty {
  // 3
  let currentStep = openSteps.removeAtIndex(0)
  closedSteps.insert(currentStep)
 
  // 4
  if currentStep.position == toTileCoord {
    println("PATH FOUND : ")
    var step: ShortestPathStep? = currentStep
    while step != nil {
      println(step!)
      step = step!.parent
    }
    return []
  }
 
  // 5
  let adjacentTiles = dataSource.walkableAdjacentTilesCoordsForTileCoord(currentStep.position)
  for tile in adjacentTiles {
    // 6
    let step = ShortestPathStep(position: tile)
    if closedSteps.contains(step) {
      continue
    }
    let moveCost = dataSource.costToMoveFromTileCoord(currentStep.position, toAdjacentTileCoord: step.position)
 
    if let existingIndex = find(openSteps, step) {
      // 7
      let step = openSteps[existingIndex]
 
      if currentStep.gScore + moveCost < step.gScore {
        step.setParent(currentStep, withMoveCost: moveCost)
 
        openSteps.removeAtIndex(existingIndex)
        insertStep(step, inOpenSteps: &openSteps)
      }
 
    } else {
      // 8
      step.setParent(currentStep, withMoveCost: moveCost)
      step.hScore = hScoreFromCoord(step.position, toCoord: toTileCoord)
 
      insertStep(step, inOpenSteps: &openSteps)
    }
  }
 
}
 
return nil

This is an important method, so let’s take it section by section:

  1. If there’s no valid data source then you can exit early. If there is one, you set up a shadowed local variable to unwrap it.
  2. Set up the data structures to keep track of the steps. The open steps list starts with the initial position.
  3. Remove the lowest F cost step from the open list and add it to the closed list. Because the list is ordered, the first step is always the one with the lowest F cost.
  4. If the current step is the destination, you’re done! For now, you’re just logging the path out to the console.
  5. Get the adjacent tiles coordinates of the current step and begin looping through them.
  6. Get the step and check that it isn’t already in the closed list. If not, calculate the movement cost.
  7. If the step is already in the open list, grab that existing version. If reaching it through the current step yields a lower G score, update its parent to the current step and re-insert it so the open list stays sorted by F score.
  8. If the step isn’t in the open list then compute the H score and add it.
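
To see the whole loop in one place, here's a compact, self-contained sketch of the same algorithm running on a plain Bool grid (true = walkable) with a fixed movement cost of 1 and no Sprite Kit dependencies — the type names and the grid are illustrative only, and it's written in current Swift syntax:

```swift
struct Coord: Hashable {
    let col: Int
    let row: Int
}

// A mutable step with a parent link, like ShortestPathStep but position-only.
final class Node {
    let pos: Coord
    var parent: Node?
    var g = 0                      // movement cost from the start
    var h = 0                      // Manhattan estimate to the goal
    var f: Int { return g + h }
    init(pos: Coord) { self.pos = pos }
}

func shortestPath(on grid: [[Bool]], from start: Coord, to goal: Coord) -> [Coord]? {
    func walkable(_ c: Coord) -> Bool {
        return c.row >= 0 && c.row < grid.count
            && c.col >= 0 && c.col < grid[c.row].count
            && grid[c.row][c.col]
    }
    func neighbors(of c: Coord) -> [Coord] {
        return [Coord(col: c.col, row: c.row - 1), Coord(col: c.col - 1, row: c.row),
                Coord(col: c.col, row: c.row + 1), Coord(col: c.col + 1, row: c.row)]
            .filter(walkable)
    }

    var open = [Node(pos: start)]             // kept sorted by f score
    var closed = Set<Coord>()

    while !open.isEmpty {
        let current = open.removeFirst()      // step 3: lowest f score first
        closed.insert(current.pos)

        if current.pos == goal {              // step 4: done -- walk the parents back
            var path = [Coord]()
            var node: Node? = current
            while let n = node, n.parent != nil {   // the start node has no parent
                path.insert(n.pos, at: 0)
                node = n.parent
            }
            return path
        }

        for next in neighbors(of: current.pos) where !closed.contains(next) { // steps 5-6
            if let i = open.firstIndex(where: { $0.pos == next }) {
                if current.g + 1 < open[i].g {      // step 7: found a cheaper route
                    open[i].g = current.g + 1
                    open[i].parent = current
                    open.sort { $0.f < $1.f }
                }
            } else {                                 // step 8: brand-new open step
                let node = Node(pos: next)
                node.parent = current
                node.g = current.g + 1
                node.h = abs(goal.col - next.col) + abs(goal.row - next.row)
                open.append(node)
                open.sort { $0.f < $1.f }
            }
        }
    }
    return nil                                 // the open list ran dry: no route
}

// A 3x3 grid with a wall in the middle; the path must go around it.
let grid = [[true, true,  true],
            [true, false, true],
            [true, true,  true]]
let path = shortestPath(on: grid, from: Coord(col: 0, row: 0), to: Coord(col: 2, row: 2))
// path has 4 coordinates (around the wall), ending at the goal
```

The numbered comments map back to the steps above; the only structural differences from the tutorial code are that the data source is collapsed into nested functions and the path is returned directly instead of via a helper.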

Build and run to try it out! If you touch the tile shown below:

Selecting a destination tile in Cat Maze

You should see this in the console:

pos=[col=22 row=3] g=9 h=0 f=9
pos=[col=21 row=3] g=8 h=1 f=9
pos=[col=20 row=3] g=7 h=2 f=9
pos=[col=20 row=2] g=6 h=3 f=9
pos=[col=20 row=1] g=5 h=4 f=9
pos=[col=21 row=1] g=4 h=3 f=7
pos=[col=22 row=1] g=3 h=2 f=5
pos=[col=23 row=1] g=2 h=3 f=5
pos=[col=24 row=1] g=1 h=4 f=5
pos=[col=24 row=0] g=0 h=0 f=0

Remember the path is built backwards, so you have to read from bottom to top to see what path the algorithm has chosen. Try to match these up to the tiles in the maze so you can see that it really is the shortest path!

Following the Yellow Brick Path

Now that the path is calculated, you just have to make the cat follow it. The cat will have to remember the whole path, and then follow it step by step.

Open Cat.swift and find moveTo(_:). Add the following code to the end of the method, after the line that sets shortestPath:

if let shortestPath = shortestPath {
  for tileCoord in shortestPath {
    println("Step: \(tileCoord)")
  }
}

The pathfinder isn’t actually returning the path yet, so switch to AStarPathfinder.swift. Remember that when the algorithm finishes, it has a path from the final step back to the beginning. This needs to be reversed and returned to the caller as an array of TileCoords instead of ShortestPathSteps.

Add the following helper method to the AStarPathfinder class:

private func convertStepsToShortestPath(lastStep: ShortestPathStep) -> [TileCoord] {
  var shortestPath = [TileCoord]()
  var currentStep = lastStep
  while let parent = currentStep.parent { // if parent is nil, then it is our starting step, so don't include it
    shortestPath.insert(currentStep.position, atIndex: 0)
    currentStep = parent
  }
  return shortestPath
}

You’re building the path in the correct order by inserting each step’s position at the beginning of the array as you walk back up the parent chain, stopping when you reach the starting step (which has no parent).
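
Inserting at index 0 shifts the whole array on every step; an equivalent approach appends as you walk the chain and reverses once at the end. A sketch with an illustrative stand-in type (a 1-D position keeps it simple; written in current Swift syntax):

```swift
// Illustrative stand-in for ShortestPathStep.
final class PathNode {
    let position: Int
    var parent: PathNode?
    init(_ position: Int, parent: PathNode? = nil) {
        self.position = position
        self.parent = parent
    }
}

func pathPositions(endingAt lastNode: PathNode) -> [Int] {
    var path = [Int]()
    var node: PathNode? = lastNode
    while let current = node, current.parent != nil {   // the start node has no parent
        path.append(current.position)
        node = current.parent
    }
    path.reverse()                                      // one O(n) pass at the end
    return path
}

// start(0) -> 1 -> 2 -> 3
let start = PathNode(0)
let last = PathNode(3, parent: PathNode(2, parent: PathNode(1, parent: start)))
let hops = pathPositions(endingAt: last)   // [1, 2, 3]
```

For paths of a few dozen tiles either version is fine; the append-then-reverse form only matters for very long paths.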

Inside shortestPathFromTileCoord(_:toTileCoord:), find the block of code inside the if currentStep.position == toTileCoord { statement that logs out the path. Replace it with the following code:

return convertStepsToShortestPath(currentStep)

This will run your helper method to put the steps in the correct order and return that path.

Build and run. If you try moving the cat, you should see this in the console:

Step: [col=24 row=1]
Step: [col=23 row=1]
Step: [col=22 row=1]
Step: [col=21 row=1]
Step: [col=20 row=1]
Step: [col=20 row=2]
Step: [col=20 row=3]
Step: [col=21 row=3]
Step: [col=22 row=3]

Yes! Now you have tile coordinates ordered from start to finish (instead of reversed), nicely stored in an array for you to use.

Getting the Cat on the Path

The last thing to do is to go through the shortestPath array and animate the cat to follow the path.

Add the following method to the Cat class:

func popStepAndAnimate() {
  if shortestPath == nil || shortestPath!.isEmpty {
    // done moving, so stop animating and reset to "rest" state (facing down)
    removeActionForKey("catWalk")
    texture = SKTexture(imageNamed: "CatDown1")
    return
  }
 
  // get the next step to move to and remove it from the shortestPath
  let nextTileCoord = shortestPath!.removeAtIndex(0)
  println(nextTileCoord)
  // determine the direction the cat is facing in order to animate it appropriately
  let currentTileCoord = gameScene.tileMap.tileCoordForPosition(position)
 
  // make sure the cat is facing in the right direction for its movement
  let diff = nextTileCoord - currentTileCoord
  if abs(diff.col) > abs(diff.row) {
    if diff.col > 0 {
      runAnimation(facingRightAnimation, withKey: "catWalk")
    } else {
      runAnimation(facingLeftAnimation, withKey: "catWalk")
    }
  } else {
    if diff.row > 0 {
      runAnimation(facingForwardAnimation, withKey: "catWalk")
    } else {
      runAnimation(facingBackAnimation, withKey: "catWalk")
    }
  }
 
  runAction(SKAction.moveTo(gameScene.tileMap.positionForTileCoord(nextTileCoord), duration: 0.4), completion: {
    let gameOver = self.updateState()
    if !gameOver {
      self.popStepAndAnimate()
    }
  })
}

This method pops one step off the array and animates the movement of the cat to that position. The cat has a walking animation too, so the method will start and stop that as appropriate in addition to ensuring the cat is facing in the correct direction.

At the end of the method, inside the completion block of the runAction(_:completion:) call, you update the game’s state to check for dogs, bones, etc. and then schedule another call to popStepAndAnimate() if there’s another step along the path.

Finally, change the code at the end of moveTo(_:) to call popStepAndAnimate() instead of printing out the steps:

if let shortestPath = shortestPath {
  popStepAndAnimate()
}

Build and run, and. . .

Aww, yeah!

The cat automatically moves to the final destination that you touch, collects bones and vanquishes menacing dogs! :]

Note: As you play the game, you’ll see that if you select a new location before it has reached the previous one, the cat moves strangely. That’s because the current movement path is interrupted and another one started. Since this tutorial isn’t focused on gameplay, we’ll gloss over this for now although the final project download at the end of this tutorial has this issue fixed – in Cat.swift look for references to currentStepAction and pendingMove.

Congratulations! You have now implemented A* pathfinding in a simple Sprite Kit game from scratch! :]

Bonus: Diagonal Movement

What if you also want to let the cat move diagonally?

You just have to update two functions:

  • walkableAdjacentTilesCoordsForTileCoord(_:): include the diagonal tiles as well.
  • costToMoveFromTileCoord(_:toAdjacentTileCoord:): calculate an appropriate movement cost for diagonal movement.

You might wonder how you should compute the cost for diagonal movement. This is actually quite easy with some simple math!

The cat is moving from the center of a tile to another, and because the tiles are squares, A, B and C form a right-angled triangle as in the diagram below:

Using the pythagorean theorem for calculating movement cost for diagonals

According to the Pythagorean Theorem, C² = A² + B², so:

C = √(A² + B²)
with A = B = 1 (The movement cost to move from a square to another = G cost)
C = √(2)
C ≈ 1.41

So the cost to move diagonally is 1.41. In comparison, moving left then up costs 2, so diagonal movement is clearly better!

Now, computing with integers is more efficient than floats, so instead of using floats to represent the cost of a diagonal move, you can simply multiply the costs by 10 and round them. So horizontal and vertical moves will cost 10, and diagonal moves 14.
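
The arithmetic checks out in a couple of lines (a quick sketch, nothing app-specific):

```swift
// Scale costs by 10 and round so they stay integers:
// straight = 10, diagonal = round(sqrt(2) * 10) = 14.
let straightCost = 10
let diagonalCost = Int(((2.0).squareRoot() * 10).rounded())

// One diagonal step beats a left-then-up pair of straight steps: 14 < 20.
let twoStraightSteps = 2 * straightCost
```

The relative ordering of routes is all A* cares about, so any consistent integer scaling of the true costs works.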

Time to try this out! First replace costToMoveFromTileCoord(_:toAdjacentTileCoord:) in Cat.swift:

func costToMoveFromTileCoord(fromTileCoord: TileCoord, toAdjacentTileCoord toTileCoord: TileCoord) -> Int {
  return (fromTileCoord.col != toTileCoord.col) && (fromTileCoord.row != toTileCoord.row) ? 14 : 10
}

In order to add the diagonals to the walkable adjacent tiles, first add some new helper properties to the TileCoord class in TileMap.swift:

/** coordinate top-left of self */
var topLeft: TileCoord {
  return TileCoord(col: col - 1, row: row - 1)
}
/** coordinate top-right of self */
var topRight: TileCoord {
  return TileCoord(col: col + 1, row: row - 1)
}
/** coordinate bottom-left of self */
var bottomLeft: TileCoord {
  return TileCoord(col: col - 1, row: row + 1)
}
/** coordinate bottom-right of self */
var bottomRight: TileCoord {
  return TileCoord(col: col + 1, row: row + 1)
}

Back in Cat.swift, modify walkableAdjacentTilesCoordsForTileCoord(_:) to return the diagonally adjacent squares:

func walkableAdjacentTilesCoordsForTileCoord(tileCoord: TileCoord) -> [TileCoord] {
  var canMoveUp = gameScene.isWalkableTileForTileCoord(tileCoord.top)
  var canMoveLeft = gameScene.isWalkableTileForTileCoord(tileCoord.left)
  var canMoveDown = gameScene.isWalkableTileForTileCoord(tileCoord.bottom)
  var canMoveRight = gameScene.isWalkableTileForTileCoord(tileCoord.right)
 
  var walkableCoords = [TileCoord]()
 
  if canMoveUp {
    walkableCoords.append(tileCoord.top)
  }
  if canMoveLeft {
    walkableCoords.append(tileCoord.left)
  }
  if canMoveDown {
    walkableCoords.append(tileCoord.bottom)
  }
  if canMoveRight {
    walkableCoords.append(tileCoord.right)
  }
 
  // now the diagonals
  if canMoveUp && canMoveLeft && gameScene.isWalkableTileForTileCoord(tileCoord.topLeft) {
    walkableCoords.append(tileCoord.topLeft)
  }
  if canMoveDown && canMoveLeft && gameScene.isWalkableTileForTileCoord(tileCoord.bottomLeft) {
    walkableCoords.append(tileCoord.bottomLeft)
  }
  if canMoveUp && canMoveRight && gameScene.isWalkableTileForTileCoord(tileCoord.topRight) {
    walkableCoords.append(tileCoord.topRight)
  }
  if canMoveDown && canMoveRight && gameScene.isWalkableTileForTileCoord(tileCoord.bottomRight) {
    walkableCoords.append(tileCoord.bottomRight)
  }
 
  return walkableCoords
}

Note that the code to add the diagonals is a bit more complicated than for horizontal/vertical movement. Why is this?

The code enforces the rule that if the cat is to move, say, diagonally to the tile to the bottom-left, it must also be able to move in both the down and left directions. This prevents the cat from walking through walls. The following diagram illustrates this:

Avoiding walking through corners with the A* pathfinding algorithm

In the diagram,

  • O = Origin
  • T = Top
  • B = Bottom
  • L = Left
  • R = Right
  • TL = Top – Left

Take for example the case shown in the top left of the above image.

The cat wants to go from the origin (O) to the bottom left diagonal square. If there is a wall to the left or below (or both), then moving diagonally would cut through the corner of a wall (or two). So the bottom-left direction is open only if there is no wall to the left or below.
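
The rule reduces to a simple conjunction. Here's a hedged sketch for the bottom-left case (the function is made up for illustration; the other three diagonals are symmetric):

```swift
// A diagonal move is legal only when both of its orthogonal components are
// walkable AND the diagonal tile itself is walkable.
func canMoveBottomLeft(canMoveLeft: Bool, canMoveDown: Bool,
                       bottomLeftWalkable: Bool) -> Bool {
    return canMoveLeft && canMoveDown && bottomLeftWalkable
}

// Wall below the cat: the diagonal would clip that wall's corner, so it's
// blocked even though the bottom-left tile itself is open.
let blocked = canMoveBottomLeft(canMoveLeft: true, canMoveDown: false,
                                bottomLeftWalkable: true)   // false

// No walls in the way: the diagonal is allowed.
let allowed = canMoveBottomLeft(canMoveLeft: true, canMoveDown: true,
                                bottomLeftWalkable: true)   // true
```

This is exactly the check encoded by the canMoveUp/canMoveLeft/canMoveDown/canMoveRight conjunctions in the code above.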

Tip: You can simulate different type of terrain by updating costToMoveFromTileCoord(_:toAdjacentTileCoord:) to take the terrain type into consideration. Lowering the cost will result in the cat preferring those squares; increasing it will cause the cat to tend to avoid the squares if a better route is available.

Build and run your project. Verify that the cat does, indeed, take the diagonal when it can.

A challenge

How would you give the cat automatic dog-avoidance behaviour? Rather than having to manually navigate the cat around a dog to get to a bone, what could you do to have the cat automatically choose the dog-free route if one is available?

Solution Inside: Avoid Dogs

Where to Go From Here?

Download the final project with all of the code from the tutorial (including diagonal movements and dog avoidance behaviour).

Congratulations, you now know the basics of the A* algorithm and how to implement it! You should be able to:

  • Implement the A* algorithm in your own game
  • Refine it as necessary (by allowing different kind of terrain, better heuristics, etc…) and optimize it

A great place to go to for more information on the A* algorithm is Amit’s A* Pages.

If you have any questions or comments about this tutorial, please join the forum discussion below!

The post How To Implement A* Pathfinding with Swift appeared first on Ray Wenderlich.

Video Tutorial: What’s New in watchOS 2 Part 2: Animation

Video Tutorial: What’s New in watchOS 2 Part 3: Complications I

Readers’ App Reviews – July 2015


Hot and Sunny Apps!

It’s been a great July, filled with warm summer air and great apps made by fellow readers!

As always, my mission is to try 'em out and showcase a few to give a snapshot of the community each month.

July is no different. We’ve got some fantastic apps to show off this month:

  • A messaging app for those who can’t wait
  • Apps for taking notes
  • A game testing your emoji prowess
  • And of course, much more!

Keep reading to see a snapshot of what the fantastic RW community built this month.

Super Twenty – impossible puzzle game

SuperTwenty
Super Twenty is a super addicting tile addition game.

Super Twenty doesn’t have any special numbers to multiply; it merely increments tiles when matching numbers are combined. Combine 1’s to get a 2, combine 2’s to get a 3, and so forth.

It's extremely simple, but extremely fun. It doesn't even lock you down to sliding the tiles in particular directions like similar games do.

Being so simple is what makes Super Twenty so much more addictive than similar games. There is always another move you can make, so you're never left hunting. I could barely stop playing long enough to write this. ;]

Mean Calculator

MeanCalculator
Mean Calculator is a fun new math game great for a little fun on the go.

Mean Calculator gives you a number and your goal is to tap out an equation to match. Each level will lock some of the calculator keys to make things a little extra challenging but most levels have multiple possible solutions. It might give you a 4 so you could do 1 + 3, 2 + 2, 2 * 2, etc. But maybe it blocks the multiplication button this level, leaving you to lowly addition.

Mean Calculator also gives great feedback reminiscent of your mean 1st grade teacher, giving a loss enough extra sting that you want to keep going to prove the calculator wrong.

Quick Emoji

Emoji
Quick Emoji is a unique game built entirely around our uncanny ability to memorize sequences.

Quick Emoji shows an emoji on the screen and gives you just seconds to find it on the regular emoji keyboard. It's surprisingly easy for ones you use often, yet surprisingly hard for ones you've never sent. You'll find you remember where emoji are that you didn't even know you'd memorized.

It's quite fun to hunt down emoji at breakneck speed.

Don’t Crack

DontCrack
Don’t Crack is a unique, fun game centered around not breaking vases as they come off the assembly line.

As the vases roll off the assembly line one at a time, you control the packing peanut release. Vases are in a crate and you must fill it with enough packing peanuts to make sure it won't break during shipping. But you'll have to be sparing: there's a limited number of packing peanuts, and you only get refilled every so often.

Timing is everything as each crate is slightly different sized with different shape vases. The physics make sure you’ll need to leave enough time for the light peanuts to fall.

Game Center integration ensures you can verify your bragging rights to master packer status.

Shuff

Shuff
Shuff is a fun and powerful realtime messenger for your iPhone.

Shuff lets you see realtime messages from your friends, letter by letter. It also has a digital whiteboard where you can draw in realtime with your friends.

Shuff doesn't save your messages as a history. They're only available until you read them, making it easier to keep them private.

Shuff also allows anonymous communication for even more privacy. You can even choose to shuffle the letters of messages making it harder for prying eyes to read.

Contacts Pad

ContactsPad
We all have a favorites list for contacts on our phones, but what about connecting to them in other ways?

Contacts Pad will let you select your favorite contacts for quick access to a bunch of them from a single screen. But Contacts Pad is not just for calling. Contacts Pad lets you quickly connect over iMessage, SMS, WhatsApp, Facebook, Twitter, Skype, and more.

Contacts Pad also has a Notification Center widget that makes starting a chat, call, or FaceTime session even faster.

ScratchTones

ScratchTones

ScratchTones is a full featured, multitrack recording studio on the go!

ScratchTones supports up to four individual tracks with independent playback, volume, balance, and more. The mixer board will allow you to mix tracks, control effects like reverb, equalizers, and a master volume.

You can keep all your ScratchTones to review, trim, edit, playback, or even remix into new tones. You can also download professionally recorded clips, both free and paid, to augment your mixes.

ScratchTones even allows recordings to be uploaded to SoundCloud in addition to sharing on Twitter, Facebook, and more.

Alara

Alara
We all enjoy beautiful sunny days, but did you know too much exposure can be dangerous? UV overexposure can cause sunburns, blisters and even longer term problems like skin cancer and cataracts. Variables in weather and season can mean UV radiation changes constantly, even hourly.

Alara provides hourly forecast information for UV radiation for your current location and gives simple recommendations to protect yourself if you must be outside.

With Alara’s help and some thoughtful planning, you can enjoy the sun during the safe periods and relax inside when there's a bit too much UV for comfort.

Ovrlay

Ovrlay
Ovrlay is a very cool app that will allow you to overlay anything you’d like such as a floor plan for your offices on top of Apple’s native maps.

Adding an Ovrlay is as easy as it should be. Simply find the location and add your image. Then you can pinch, zoom, and rotate your image until it lines up just right with the map. You can even highlight places within the overlay such as bathrooms, stairs, and main rooms. There's even the ability to add routing information to enable navigation through the overlay.

You can share your overlays with the world and other users can find overlays of their favorite buildings throughout the world.

PayBacks

PayBacks
PayBacks is a very good looking small loan journal to help keep track of money between friends.

PayBacks lets you track small loans easily so you remember who owes how much for what and when. You can track money you’re owed or that you owe. You can set reminders to make sure you remember to pay or collect. And you can remind your friends if they’re forgetful.

PayBacks is a simple app that looks great. Perfect for remembering the small change.

Geeksbox

Geeksbox
Geeksbox is a social network for geeks!

Geeksbox is a basic social network built for geeks. Post your latest cosplay costume, homemade arcade machine, urban garden, whatever! Other geeks can rate, review, and comment on your latest posts.

What makes Geeksbox special is the community of geeks it's built around. It's not for just anyone; it's a network focused entirely on those who like what you like. You can even filter it down further to just the things you really like.

Geeksbox is a free app, so download it and start geeking out!

Index Card 4 – Corkboard Writing App

IndexCards

Index Card is a simple app that helps you capture, organize, and compile your ideas. Authors of all kinds can benefit from organizing their thoughts, and Index Card is a great way to start on the iPad.

Index cards are fully formatted, supporting rich text and images. You can search titles, bodies, notes, and more. Using the built-in Storyboard feature, you can organize notes into a linear story.

It's easy to share your work with exports to RTF, PDF, 3-card layouts and more. You can share native files with other Index Card users as well. Index Card even has a presentation mode for when you AirPlay to a TV.

Index Card is packed with features, with more to come. Definitely worth a look if you need a place to collect your thoughts.

Intellie Notes

ItelliNotes
Intellie Notes is a clean note taking app for your Mac that focuses on keeping things simple.

In Intellie Notes you don't clutter up notes with tags and folders; instead, you assign colors to notes. You decide what you'd like colors to mean for your own organization. This makes it very easy to scan for the note you're looking for without reading. And you can easily filter by a particular color if you know what you're looking for.

Intellie Notes uses iCloud for its syncing to make sure your notes are tracked across multiple devices automatically.



Honorable Mentions

Each and every month I get more submissions than I have time to review. I download every app and see what it's about, but I can't write about them all. I love seeing all the apps submitted by readers like you. It's not a popularity contest or even a favorite-picking contest. I just try to get a glimpse of what the community is working on through your submissions. Take a moment and check out these other great apps I didn't have time to showcase properly.

Oquonie
Escape Games 274
Boss Money
Luke Carelsen
Tiny Hugs – Endless Adventure
Medieval Monsters
Shy Selfie
MathsMatch
Tumble! – Experiments
Postys – Video digest.
Woodventure – mahjong connect
Lumps of Clay
BugRunner
Yet Another (Watch) Puzzle Game
Loops – A Game About Reflexes
Math Brain for Your Apple Watch
Who Likes the Rain?
Delta: It’s All About Four
Slide – A Game About Timing
Viable
Escape Games 302
Escape Games 303



Where To Go From Here?

Each month, I really enjoy seeing what our community of readers comes up with. The apps you build are the reason we keep writing tutorials. Make sure you tell me about your next one, submit here!

If you saw an app you liked, hop to the App Store and leave a review! A good review always makes a dev's day. And make sure you tell them you're from raywenderlich.com; this is a community of makers!

If you’ve never made an app, this is the month! Check out our free tutorials to become an iOS star. What are you waiting for – I want to see your app next month!

The post Readers’ App Reviews – July 2015 appeared first on Ray Wenderlich.

Video Tutorial: What’s New in watchOS 2 Part 4: Complications II

Integrating Parse and React Native for iOS

Combine the power of a Parse backend + React Native iOS frontend!

Combine the power of a Parse backend + React Native iOS frontend!

React Native, introduced at the 2015 Facebook F8 Developer Conference, lets you build native iOS apps using the same concepts found in the popular JavaScript UI library React. The same event also gave us Parse+React, which brings the React view concepts to the data layer.

Parse supports rapid development of your mobile apps by handling your application’s infrastructure needs for you. Parse services include data storage, social integration, push notifications, and analytics, along with client SDKs for various platforms such as iOS, Android, and JavaScript. Parse+React is built on top of the JavaScript SDK and provides hooks into React to make it easy to query and store data on Parse.

In this tutorial, you’ll learn more about React and to use it to build native apps. You’ll build upon the sample PropertyFinder application from our introductory tutorial React Native: Building Apps with JavaScript. Be sure to check out that tutorial for all the React Native basics before continuing on here with integrating Parse.

Getting Started

To get started with the tutorial, download the starter project and unzip it.

This is essentially the same PropertyFinder application with one important difference. Can you spot the difference by looking at some of the JavaScript files?

The original application used ECMAScript 6 (ES6), but the starter project for this tutorial doesn't. This is because Parse+React relies on mixins to bring Parse functionality into React objects, and mixins aren't supported by ES6 classes. A future update to React that adds support for the key observe API will remove the need for using a mixin.

Make sure the React Native pre-requisites are installed. This should be the case if you worked through the previous tutorial.

In v0.8.0, React Native moved from using Node.js to io.js. If you don't have io.js installed, set it up via homebrew by executing the following in a Terminal window:

brew unlink node
brew install iojs
brew link iojs --force

This removes node from your path, downloads the latest version of io.js, and tells homebrew to run io.js whenever you run node.

Next, verify that the starter project is set up correctly. Open Terminal, go to the ParseReactNative-PropertyFinder-Starter directory and execute the following command:

npm install

Next, open PropertyFinder.xcodeproj then build and run the project. The simulator will start and display the app’s home page. Test that you’re able to search for UK listings as you did in the previous tutorial:

previous_app

If everything looks good, then you’re ready to integrate Parse+React with your app.

The Parse+React Structure

The Parse+React layer sits on top of both React Native and the Parse JavaScript SDK:

Architecture

It’s available via npm or GitHub.

Parse+React brings the same simplicity to your data management that React brings to your view layers. To understand how this works, consider the React component lifecycle shown below:

Component Life

Parse+React mimics this flow by hooking into your component’s lifecycle. You first set up queries you want to associate with your component. Parse+React then subscribes your component to receive the query results, fetches the data in the background, passes it back to your component and triggers the component’s render cycle like so:

Data Fetch Lifecycle

Co-locating the Parse query in your component with the view makes it much easier to understand your code. You can look at your component code and see exactly how the queries tie into the view.
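The flow above can be modelled in a few lines of plain JavaScript. This is only a toy simulation of what the mixin does for you — the names observe, data, and render mirror the Parse+React API, but runQuery here is a stand-in for the real background fetch:

```javascript
// Toy model of the Parse+React data flow: on mount, observe() is
// called, each returned query is run, its results are attached to
// the component's data under the same key, and render() runs again.
function mountAndRender(component, runQuery) {
  const queries = component.observe() || {};
  component.data = {};
  for (const key of Object.keys(queries)) {
    component.data[key] = runQuery(queries[key]); // background fetch stand-in
  }
  return component.render();
}
```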

Modeling Your Property Data

In this tutorial you’ll take out the calls to the Nestoria API and replace them with calls to Parse. In the next few sections, you’ll see how to set up Parse to do this.

Creating Your Parse App

The first thing you should do is sign up for a free Parse account if you haven’t done so. Next, go to the Parse Dashboard and click Create a new App:

Create new Parse app

Name your app PropertyFinder, then click Create. You should see a success note as well as a link to grab your Parse application keys. Click that link and note your Application ID and JavaScript Key. You’ll need these later.

Click Core from the top menu to go to the Data Browser view, where you can see any data stored on Parse for your app. You should see a page stating that you have no classes to display. You’re going to take care of that by creating dummy data to represent property listings that you’ll pull into your app later on in the tutorial.

Defining your Schema

You can use the data displayed in the current PropertyFinder app to figure out what your schema should be.

Open SearchResults.js and examine the renderRow function. Look for the fields from the Nestoria API that display the data. Next, open PropertyView.js and look at the render function to determine if there’s any additional information you’ll need for your schema.

When you’re done, your list of required data elements should match the following:

  • img_url
  • price_formatted
  • title
  • property_type
  • bedroom_number
  • bathroom_number
  • summary

Now you need to create a class in Parse with these fields to represent a property listing. In your Parse Data Browser, click Add Class and name your class Listing:

Create Parse class

Once you click Create Class, you should see a new Listing class added to the left-hand side. There are other types of classes you can create, such as User, which has some special methods and properties not present in a custom class.

However, your custom class will serve the needs of your app just fine. Think of a Parse class as a database table; the columns you’ll define next are similar in concept to database columns.

Click + Col to add a new column:

Add column to a Parse class

Select File from the type selection, enter img_url, then click Create Column:

Add img column

You should see a new column appear in the header of your class. Parse supports many data types including string, number, boolean, and binary data. Here you’re using the File type to store binary data that represents a property’s image.

Next, add a column to represent the price. To keep things simple, instead of naming the column price_formatted, name it price, and select Number for the column type.

Now carry on and create columns for the rest of the fields as follows:

  • title: Type String
  • property_type: Type String
  • bedroom_number: Type Number
  • bathroom_number: Type Number
  • summary: Type String

Verify that all the columns and their types look like the ones shown below:

After columns added


You may have noticed some starter columns were already there when you added the class. Here’s what those are for:

  • objectId: uniquely identifies a row. This ID is auto-generated.
  • createdAt: contains the current timestamp when you add a new row.
  • updatedAt: the time you modified a row.
  • ACL: contains the permission information for a row. ACL stands for “access control list”.

Adding Some Sample Data

You’re almost ready to touch some actual code — oh the anticipation! :] But you’ll need to add some data to work with first.

You can download some sample property photos in this zip file. Download and unzip the file; the photos are contained in the Media directory.

Still within the Data Browser on the Parse site, click + Row or + Add a row. Double-click inside the new row’s img_url column to upload a photo. The label should change from undefined to Upload File as shown below:

parse_upload_file

Click Upload File, browse to house1.jpeg, then click Open. The Data Browser should now show a new row with img_url set:

parse_img_added

You should also see the objectId, createdAt, updatedAt and ACL columns set appropriately. By default, the ACL permission is set to public read and write.

Click Security and change the Listing class permission to public read only:

parse_secure_class

Click Save CLP. Note that the class-level permission will supersede an individual row's permission setting.

Note: There are many options you can use to secure your data. You can learn more from this series of blog posts from Parse.

Continue filling in data for this new row as follows:

  • price: 390000
  • title: Grand mansion
  • property_type: house
  • bedroom_number: 5
  • bathroom_number: 4
  • summary: Luxurious home with lots of acreage.

Armed with this pricey listing, you're ready to modify your app and test your Parse setup.

Swapping in Parse Calls

It’s finally time to get your hands on the code! You’ll start by retrieving all listings on Parse, without any filtering to begin.

Modifying the Query Logic

Open package.json and add the following two new dependencies:

{
  "name": "PropertyFinder",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node_modules/react-native/packager/packager.sh"
  },
  "dependencies": {
    "react-native": "^0.8.0",
    "parse": "^1.5.0",
    "parse-react": "^0.4.2"
  }
}

Don’t forget to add a comma (,) to the end of the react-native dependency. With this change, you’ve added Parse and Parse+React to your list of dependencies.

Use Terminal to navigate to your project’s main directory and execute the following command:

npm install

This should pull in the dependencies you just added. You should see some output similar to the following:

parse-react@0.4.2 node_modules/parse-react
 
parse@1.5.0 node_modules/parse
└── xmlhttprequest@1.7.0

Next, you’ll initialize Parse and add your credentials to the app.

Open index.ios.js and add the following line beneath the other require statements, but before the destructuring assignment of AppRegistry and StyleSheet:

var Parse = require('parse').Parse;

This loads the Parse module and assigns it to Parse.

Add the following code after the destructuring assignment:

Parse.initialize(
  'YOUR_PARSE_APPLICATION_ID',
  'YOUR_PARSE_JAVASCRIPT_KEY'
);

Replace YOUR_PARSE_APPLICATION_ID with your Parse application ID and YOUR_PARSE_JAVASCRIPT_KEY with your Parse JavaScript key. You did write down your Parse application ID and JavaScript key, didn’t you? :] If not, you can always go to the Parse Dashboard and look at the Settings page for your app to find them again.

Open SearchPage.js to make your query logic changes. Add the following code near the top of the file, beneath the React require statement:

var Parse = require('parse').Parse;
var ParseReact = require('parse-react');

This loads the Parse and Parse+React modules and assigns them to Parse and ParseReact respectively.

Next, update the SearchPage declaration to add the Parse+React mixin to the component, just above getInitialState:

var SearchPage = React.createClass({
  mixins: [ParseReact.Mixin],

A React mixin is a way to share functionality across disparate components. It’s especially useful when you want to hook into a component’s lifecycle. For example, a mixin could define a componentDidMount method. If a component adds this mixin, React will call the mixin’s componentDidMount hook as well as the component’s componentDidMount method.

ParseReact.Mixin adds lifecycle hooks into a component when it’s mounted or it’s about to update. The mixin looks for an observe method where the Parse queries of interest are defined.
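The stripped-down sketch below shows the idea of lifecycle merging — when a mixin and a component both define componentDidMount, both hooks run. This illustrates the concept only; React's actual mixin merging handles many more cases:

```javascript
// Merge a mixin into a component spec so that both componentDidMount
// hooks fire, mixin first -- mirroring React.createClass behavior.
function withMixin(mixin, spec) {
  return Object.assign({}, mixin, spec, {
    componentDidMount() {
      if (mixin.componentDidMount) mixin.componentDidMount.call(this);
      if (spec.componentDidMount) spec.componentDidMount.call(this);
    },
  });
}
```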

Add the following method to your component after the getInitialState definition:

observe: function(props, state) {
  var listingQuery = (new Parse.Query('Listing')).ascending('price');
  return state.isLoading ?  { listings: listingQuery } : null;
},

This sets up a Parse.Query for Listing data and adds a query filter to sort the results by least expensive first. This query executes whenever isLoading is true — which is the case whenever you initiate a search.

The results from Parse.Query will be attached to this.data.listings based on the key — listings — that’s paired with the listingQuery query.

Modify _executeQuery as shown below to only set the loading flag for now, rather than perform the call to the server:

_executeQuery: function() {
  this.setState({ isLoading: true });
},

Next, modify onSearchPressed to call _executeQuery:

onSearchPressed: function() {
  this._executeQuery();
},

Right now you’re not using the search term and loading all records instead; you’ll add this later on in the tutorial.

In a similar fashion, modify onLocationPressed to call _executeQuery as follows:

onLocationPressed: function() {
  navigator.geolocation.getCurrentPosition(
    location => {
      this._executeQuery();
    },
    error => {
      this.setState({
        message: 'There was a problem with obtaining your location: ' + error
      });
  });
},

Again, you’re calling `_executeQuery()` without using the location information just yet.

To clean up after yourself, delete _handleResponse and urlForQueryAndPage since you have no more need for these response handlers. Ahh, deleting code is so satisfying, isn’t it? :]

This is a good point to test your fetching logic. ParseReact.Mixin forces a re-rendering of your component whenever the results return.

Add the following statement to render just after the point where you set up the spinner:

console.log(this.data.listings);

This logs the listing data each time you render the component — including after you run the query.

Close the React Native packager window if it’s running so you can start afresh.

Open PropertyFinder.xcodeproj and build and run; the simulator will start and display the same UI you know and love from the original app:

reactparse-1

Tap Go and check the Xcode console. You should see some output like this:

2015-06-05 10:14:27.028 [info][tid:com.facebook.React.JavaScript] []
2015-06-05 10:14:27.589 [info][tid:com.facebook.React.JavaScript] [{"id":{"className":"Listing","objectId":"vbHwqDH6n5"},"className":"Listing","objectId":"vbHwqDH6n5",
"createdAt":"2015-06-05T16:23:14.252Z","updatedAt":"2015-06-05T16:24:39.842Z","bathroom_number":4,"bedroom_number":5,
"img_url":{"_name":"tfss-30d28f46-1335-45d7-8d72-e02684c17d25-house1.jpeg",
"_url":"http://files.parsetfss.com/ec34afd8-2b15-4aea-a904-c96e05b4c83a/tfss-30d28f46-1335-45d7-8d72-e02684c17d25-house1.jpeg"},
"price":390000,"property_type":"house","summary":"Luxurious home with lots of acreage.","title":"Grand mansion"}]

The listing data is empty initially, but once you fetch the data it contains the single listing retrieved from Parse. You may also have noticed that the spinner remains once you’ve gotten the search results. This is because you haven’t done anything to properly handle the results. You’ll take care of this later on, but first you’ll take a brief detour into some UI modifications to use Parse data.

Modifying the UI

Open PropertyView.js and remove the following line in render:

var price = property.price_formatted.split(' ')[0];

You no longer have to worry about reformatting price data, since now you have control over the input data.

Modify the related display code, so that instead of accessing the price variable you just deleted, it uses the price property:

<View style={styles.heading}>
  <Text style={styles.price}>${property.price}</Text>

Also notice that you’re now in US territory, so you’ve deftly changed the currency symbol from pounds to dollars. :]

Next, modify the code to access the image’s URI as follows:

<Image style={styles.image}
  source={{uri: property.img_url.url()}} />

You add the call to url() to access the actual image data in Parse. Otherwise, you’d only get the string representation of the URL.

Open SearchResults.js and make a similar change in renderRow by deleting this line:

var price = rowData.price_formatted.split(' ')[0];

You’ll have to modify the related display code like you did before. Since it’s now showing a dollar value, not a pound value, modify the code as follows:

<View style={styles.textContainer}>
  <Text style={styles.price}>${rowData.price}</Text>

Next, update the image access as shown below:

<Image style={styles.thumb}
  source={{ uri: rowData.img_url.url() }} />

Still in SearchResults.js, change rowPressed to check a different property for changes:

rowPressed: function(propertyGuid) {
  var property = this.props.listings
    .filter(prop => prop.id === propertyGuid)[0];

Parse+React identifies unique rows through an id property; therefore you’re using this property instead of the guid.

Similarly, change the implementation of getInitialState to the following:

getInitialState: function() {
  var dataSource = new ListView.DataSource({
    rowHasChanged: (r1, r2) => r1.id !== r2.id
  });
  return {
    dataSource: dataSource.cloneWithRows(this.props.listings)
  };
},

Finally, modify renderRow to use id:

<TouchableHighlight onPress={() => this.rowPressed(rowData.id)}
    underlayColor='#dddddd'>

These changes will use id instead of guid to match up the data records properly.

You’re not quite ready to test your UI code changes. You’ll first need to modify the data fetching logic to transition to your new UI.

Handling the Results

Open SearchPage.js to properly handle your query results. You’ll be camped in this file for the rest of the tutorial, so get comfortable! :]

Earlier on in this tutorial, your data fetch simply logged the results. Remove the debug statement in render as it’s no longer needed:

console.log(this.data.listings);

To properly handle the results, you’ll reset the loading flag in the SearchPage component and navigate to the SearchResults component with the listing data. Keep in mind that ParseReact.Mixin forces a re-rendering of your component whenever the results return.

How can you detect that a fetched result triggered the rendering? Furthermore, where should you check this and trigger the navigation?

ParseReact.Mixin exposes pendingQueries, which returns an array with the names of the in-progress queries. During the search, you can check for a zero length array to indicate the results have returned and hook your completion check in componentDidUpdate that triggers post-render.

Add the following method just above render:

componentDidUpdate: function(prevProps, prevState) {
  if (prevState.isLoading && (this.pendingQueries().length == 0)) {
    this.setState({ isLoading: false });
    this.props.navigator.push({
      title: 'Results',
      component: SearchResults,
      passProps: { listings: this.data.listings }
    });
  }
},

This code first checks isLoading and if true, checks that the query results are in. If these conditions are met, you reset isLoading and push SearchResults with this.data.listings passed to it.

It’s generally frowned upon to change state in componentDidUpdate, since this forces another render call. The reason you can get away with this here is that the first forced render call doesn’t actually change the underlying view.

Keep in mind that React makes use of a virtual DOM and only updates the view if the render call changes any part of that view. The second render call triggered by setting isLoading does update the view. That means you only get a single view change when results come in.
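The completion check boils down to a small state machine. Here's a hedged sketch with stubbed-out component pieces — setState, pendingQueries, and navigate are stand-ins, not the real React/Parse+React implementations:

```javascript
// Navigate only on the render that follows the last pending query
// draining, and only if the component was loading before this update.
function handleUpdate(component, prevState) {
  if (prevState.isLoading && component.pendingQueries().length === 0) {
    component.setState({ isLoading: false });    // clears the spinner
    component.navigate(component.data.listings); // push the results view
    return true;
  }
  return false;
}
```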

Press Cmd+R in the simulator, then tap Go and view your one lonely, yet very satisfying, result:

reactparse-data1

It’s not much fun returning every listing regardless of the search query. It’s time to fix this!

Adding Search Functionality

You may have noticed that your current data schema doesn’t support a search flow since there’s no way to filter on a place name.

There are many ways to set up a sophisticated search, but for the purposes of this tutorial you’re going to keep it simple: you’ll set up a new column that will contain an array of search query terms. If a text search matches one of the terms, you’ll return that row.

Go to your Data Browser and add a column named place_name of type Array, like so:

parse_add_place_name

Click inside the place_name field of the existing row and add the following data:

["campbell","south bay","bay area"]

Head back to your React Native code. Still in SearchPage.js, modify getInitialState to add a new state variable for the query sent to Parse and also modify the default search string displayed:

getInitialState: function() {
  return {
    searchString: 'Bay Area',
    isLoading: false,
    message: '',
    queryName: null,
  };
},

Next, you’ll need to modify observe to check for the existence of a place name query.

Add the following filter to your Parse.Query to look for the place name:

observe: function(props, state) {
  var listingQuery = (new Parse.Query('Listing')).ascending('price');
  if (state.queryName) {
    listingQuery.equalTo('place_name', state.queryName.toLowerCase());
  }
  return state.isLoading ?  { listings: listingQuery } : null;
},

The equalTo filter looks through the values of an array type and returns objects where a match exists. The filter you’ve defined looks at the place_name array and returns Listing objects where the queryName value is contained in the array.
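Modelled as a plain array filter, the behaviour of equalTo on an array column looks like this — a simplified stand-in for what Parse does server-side, not the SDK's actual code:

```javascript
// A row matches when its array-typed column *contains* the value --
// the same semantics equalTo has on Parse array columns.
function matchArrayColumn(rows, column, value) {
  return rows.filter((row) => (row[column] || []).includes(value));
}
```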

Now, modify _executeQuery to take in a query argument and set the queryName state variable:

_executeQuery: function(nameSearchQuery) {
  this.setState({
    isLoading: true,
    message: '',
    queryName: nameSearchQuery,
  });
},

Then, modify onSearchPressed to pass the search string from the text input:

onSearchPressed: function() {
  this._executeQuery(this.state.searchString);
},

Finally, modify onLocationPressed to pass in null to _executeQuery:

onLocationPressed: function() {
  navigator.geolocation.getCurrentPosition(
    location => {
      this._executeQuery(null);
    },
    error => {
      this.setState({
        message: 'There was a problem with obtaining your location: ' + error
      });
    }
  );
},

You do this as you don’t want to execute a place name search when a location query triggers.

In your simulator, press Cmd+R; your application should refresh and you should see the new default search string.


Tap Go and verify that you get the same results as before.

Now go back to the home page of your app, enter neverland in the search box and tap Go:


Uh-oh. Your app pushed the new view with an empty result set. This would be a great time to add some error handling! :]

Update componentDidUpdate to the following implementation:

componentDidUpdate: function(prevProps, prevState) {
  if (prevState.isLoading && (this.pendingQueries().length == 0)) {
    // 1
    this.setState({ isLoading: false });
    // 2
    if (this.queryErrors() !== null) {
      this.setState({ message: 'There was a problem fetching the results' });
    } else if (this.data.listings.length == 0) {
      // 3
      this.setState({ message: 'No search results found' });
    } else {
      // 4
      this.setState({ message: '' });
      this.props.navigator.push({
        title: 'Results',
        component: SearchResults,
        passProps: {listings: this.data.listings}
      });
    }
  }
},

Taking the code step-by-step you’ll see the following:

  1. Here you turn off isLoading to clear out the loading indicator.
  2. Here you check this.queryErrors, which is another method that ParseReact.Mixin exposes. The method returns a non-null object if there are errors; you’ve updated the message to reflect this.
  3. Here you check if there are no results returned; if so, you set the appropriate message.
  4. If there are no errors and there is data, push the results component.

Press Cmd+R and test the empty results case once again. You should now see the relevant message without the empty results component pushed:


Feel free to add more rows to your Listing class in the Parse Data Browser to test additional search queries; you can make use of the sample photos available in the Media folder you downloaded earlier.

Adding Location Queries

Querying locations is really easy to do with Parse, since Parse supports a GeoPoint data type and provides API methods to perform a variety of geo-based queries, such as searching for locations within a certain radius.
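Under the hood, "within a certain radius" is a great-circle distance comparison. Parse computes this server-side, but a rough haversine sketch shows the idea. The coordinates below reuse this tutorial's listing plus a hypothetical point near Apple HQ:

```javascript
// Rough haversine sketch of the distance check behind a radius query.
// Parse performs this server-side; this is only to illustrate the filter.
function milesBetween(a, b) {
  const rad = deg => (deg * Math.PI) / 180;
  const dLat = rad(b.latitude - a.latitude);
  const dLon = rad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.latitude)) * Math.cos(rad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 3958.8 * 2 * Math.asin(Math.sqrt(h)); // Earth's radius in miles
}

const nearAppleHQ = { latitude: 37.33182, longitude: -122.03118 };
const listing = { latitude: 37.277455, longitude: -121.937503 };
milesBetween(nearAppleHQ, listing) < 25; // → true
```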

Go to your Data Browser and add a column named location of type GeoPoint:


You’ll need to add some location data for your initial row.

Double-click in the location field and add 37.277455 for the latitude and -121.937503 for the longitude:


Head back to SearchPage.js and modify getInitialState as follows:

getInitialState: function() {
  return {
    searchString: 'Bay Area',
    isLoading: false,
    message: '',
    queryName: null,
    queryGeo: {},
  };
},

This adds a new queryGeo state to hold location data.

Next, modify _executeQuery to take in location data like so:

_executeQuery: function(nameSearchQuery, geoSearchQuery) {
  this.setState({
    isLoading: true,
    message: '',
    queryName: nameSearchQuery,
    queryGeo: geoSearchQuery,
  });
},

Here, you’ve added an additional parameter for the location-based query and then add whatever’s passed in to the current state.

Next, modify onSearchPressed to pass an empty location to _executeQuery:

onSearchPressed: function() {
  this._executeQuery(this.state.searchString, {});
},

The search button is for when you’re searching by place name rather than by location, which means you can just pass in an empty object for the geoSearchQuery.

Modify onLocationPressed to finally make use of this precious location data by passing it on to _executeQuery:

onLocationPressed: function() {
  navigator.geolocation.getCurrentPosition(
    location => {
      this._executeQuery(
        null,
        {
          latitude: location.coords.latitude,
          longitude: location.coords.longitude,
        }
      );
    },
    error => {
      this.setState({
        message: 'There was a problem with obtaining your location: ' + error
      });
    }
  );
},

This time, the updated call to _executeQuery passes in null for the search string and actual coordinates for the geoSearchQuery.

Finally, modify observe to add the location-based search filter:

observe: function(props, state) {
  var listingQuery = (new Parse.Query('Listing')).ascending('price');
  if (state.queryName) {
    listingQuery.equalTo('place_name', state.queryName.toLowerCase());
  } else if (state.queryGeo && state.queryGeo.latitude &&
             state.queryGeo.longitude) { // 1
    // 2
    var geoPoint = new Parse.GeoPoint({
        latitude: state.queryGeo.latitude,
        longitude: state.queryGeo.longitude,
    });
    // 3
    listingQuery.withinMiles('location', geoPoint, 25);
  }
  return state.isLoading ? { listings: listingQuery } : null;
},

Taking each numbered comment in turn:

  1. Here you check if this is a location query.
  2. Next, you create a Parse.GeoPoint based on the location coordinates.
  3. Finally, you add a filter for locations within 25 miles of the point of interest.

Before you can test the location-based search, you’ll need to specify a location that will yield results.

From the simulator menu, select Debug\Location\Apple to set your simulated location to a spot near Apple headquarters.

In the simulator, press Cmd+R. Tap Location, permit the app to receive location, then verify that you see the expected result:


Adding More Test Data

The folder you downloaded earlier contains a JSON test data file — Listing.json — that you can import instead of entering your own data.

To import this data go to your Data Browser and perform the following actions:

  1. Click Import.
  2. Drag Listing.json into the upload area.
  3. Make sure the Custom option is selected and click Finish Import.
  4. Dismiss the pop-up.


You should receive a confirmation email once it’s done; since you’re importing a very small amount of data, this should happen very quickly.

Once the import is complete, you’ll need to fix the image URLs. These will contain incorrect information and you need to upload the photos yourself. Go to all the newly imported rows and, one by one, delete the existing img_url entry, then upload the corresponding photo from the Media folder.

You’ll notice you have a duplicate for the “Grand mansion” property, since you created it manually and it’s also in the import file. Delete one of the copies to keep things clean.

In your simulator, press Cmd+R, click Location and verify that you see the additional results from your imported test data:


Where to Go From Here?

You can download the completed project here. Remember to update index.ios.js with your own Parse application and Javascript keys so you connect to your own data set!

You’ve only scratched the surface of what you can do with Parse+React; there’s a whole world out there beyond simply fetching data. You can save data and even use the underlying Parse JavaScript SDK APIs to create users or add Parse analytics. Check out Parse+React on GitHub for more details.

For more information on Parse itself, check out our Parse Tutorial: Getting Started with Web Backends. If you want to hear more about React Native, have a listen to our podcast episode with Nick Lockwood, who works on the React Native team at Facebook.

If you have comments or questions, feel free to share them in the discussion below!

The post Integrating Parse and React Native for iOS appeared first on Ray Wenderlich.


Engaging the Developer Community with Janie Clayton

Learn how to become a part of the developer community with Janie Clayton!


[Subscribe in iTunes] [RSS Feed]

Our Sponsor

  • Bushel: A cloud-based management solution for all the Mac, iPhone, iPad, and iPod devices in your workplace.

Links

Contact Us

Where To Go From Here?

We hope you enjoyed this episode of our podcast. Be sure to subscribe in iTunes to get notified when the next episode comes out.

We’d love to hear what you think about the podcast, and any suggestions on what you’d like to hear next season. Feel free to drop a comment here, or email us anytime at podcast@raywenderlich.com.

The post Engaging the Developer Community with Janie Clayton appeared first on Ray Wenderlich.

RWDevCon 2016: Ticket Sales Open in 1 Week!

RWDevCon 2016: Ticket Sales Open in 1 Week!


Earlier this year, we ran a conference focused on high quality hands-on tutorials called RWDevCon.

The conference was a huge hit and got rave reviews, so we are running it again next March!

This is just a quick heads-up that ticket sales for the conference will open up in 1 week, on Wed Aug 12 @ 12:00 Noon EST.

We have a great line-up of speakers this year and will be making some new tutorials based on what you vote for in a conference survey, including new material released this year at WWDC.

Last year the conference sold out very quickly, so if you’re interested in attending, be sure to snag your ticket while you still can.

To get notified when tickets become available, sign up for our RWDevCon newsletter. We hope to see you at the conference! :]

The post RWDevCon 2016: Ticket Sales Open in 1 Week! appeared first on Ray Wenderlich.

Video Tutorial: What’s New in watchOS 2 Part 5: Watch Connectivity

UIGestureRecognizer Tutorial: Creating Custom Recognizers

Learn how to recognize drawn circles using a custom UIGestureRecognizer


Custom gesture recognizers delight users by making apps feel unique and alive. If basic taps, pans, and rotations are the utility and pickup trucks of the iOS world, custom gesture recognizers are the flashy hot rods with custom paint jobs and hydraulics. Read this custom UIGestureRecognizer tutorial and learn all about gesture recognizers!

In this tutorial you’ll take a fun little “find the differences” game and make it interactive by adding a custom circle gesture recognizer to select the non-matching image. Along the way you’ll learn:

  • How to use UIGestureRecognizer subclasses to leverage the provided state machine and callback mechanism to simplify gesture detection.
  • How to fit a circle to a collection of touched points.
  • How to be “fuzzy” in recognizing specific shapes, as drawing with one’s finger is often imprecise.

Note: This gesture recognizer tutorial assumes you already have general knowledge of how gesture recognizers work and how to use a pre-defined gesture recognizer in your app. To get up to speed, read through the UIGestureRecognizer tutorial on this site.

Getting Started

MatchItUp is a simple game that shows four images to the user, three alike and one image that is slightly different than the others. The user’s job is to identify the odd one out by drawing a circle over it with a finger:

The MatchItUp! game


Download and open the starter project for this tutorial here.

Build and run your app; you’ll see four images, but you can’t select the odd one out yet. Your task is to add a custom gesture recognizer to this game. The custom gesture recognizer will detect when the user draws a circle around an image. If they draw around the odd one out, they win!

Adding a Custom Gesture Recognizer

Go to File\New\File… and select the iOS\Source\Cocoa Touch Class template to create a class called CircleGestureRecognizer as a subclass of UIGestureRecognizer. Make sure Swift is selected. Then click Next and then Create.

In order for a gesture recognizer to work, it has to be attached to a view in the responder chain. When a user taps on the screen, the touch event is forwarded along the stack of views with each view’s gesture recognizers receiving a chance to handle those touches.

Open GameViewController.swift and add an instance variable for your gesture recognizer:

var circleRecognizer: CircleGestureRecognizer!

Next, add the following code to the bottom of viewDidLoad():

circleRecognizer = CircleGestureRecognizer(target: self, action: "circled:")
view.addGestureRecognizer(circleRecognizer)

This creates the gesture recognizer and adds it to the main view.

But wait… if the goal is to have the user circle the differing image, why not add the recognizer to each image view instead of the main view?

That’s a great question — glad you asked! :]

When building gesture recognizers, you must compensate for the imprecise nature of the user interface. If you’ve ever tried to sign your name inside a little box on a touchscreen, you’ll know what I mean! :]

When you put the recognizer on the whole view, it’s more forgiving to users who start or continue a gesture slightly outside the bounds of the image’s box. Eventually, your recognizer will have a tolerance setting to help those who can’t draw a perfect circle.

Build and run your app; even though you’ve created a subclass of UIGestureRecognizer, you haven’t added any code yet so it will recognize… exactly zero gestures! To make it useful, your gesture recognizer needs to implement a gesture recognizer state machine.

The Gesture Recognizer State Machine

The simplest gesture in a user’s repertoire is a tap; the user puts a finger down and then lifts it up. There are two methods called on the gesture recognizer for this event: touchesBegan(_:withEvent:) and touchesEnded(_:withEvent:).

In the case of a simple tap gesture, these methods correspond to the gesture recognizer states .Began and .Ended:

Basic Two-State Machine

A Basic Tap Recognizer
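The flow above is just a tiny state machine plus a callback. As a language-agnostic sketch (shown in JavaScript for brevity; the real Swift implementation follows below), it looks like this:

```javascript
// Toy model of the tap recognizer: each state change notifies the
// target action, mirroring UIGestureRecognizer's callback mechanism.
function makeTapRecognizer(onStateChange) {
  let state = 'possible';
  const setState = next => { state = next; onStateChange(next); };
  return {
    touchesBegan: () => setState('began'),
    touchesEnded: () => setState('ended'),
    get state() { return state; },
  };
}

const states = [];
const tap = makeTapRecognizer(s => states.push(s));
tap.touchesBegan();
tap.touchesEnded();
// states is now ['began', 'ended']
```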

To see this in action, you’ll implement this state machine in the CircleGestureRecognizer class.

First things first! Add the following import at the top of CircleGestureRecognizer.swift:

import UIKit.UIGestureRecognizerSubclass

UIGestureRecognizerSubclass is a public header in UIKit, but isn’t included in the umbrella UIKit header. Importing it is necessary to update the state property, which would otherwise be read-only in UIGestureRecognizer.

Now add the following code to the same class:

override func touchesBegan(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesBegan(touches, withEvent: event)
  state = .Began
}
 
override func touchesEnded(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesEnded(touches, withEvent: event)
  state = .Ended
}

If you ran the app now and tapped the screen, the app would crash since you are not handling the gesture yet.

Add the following to the class in GameViewController.swift:

func circled(c: CircleGestureRecognizer) {
  if c.state == .Ended {
    let center = c.locationInView(view)
    findCircledView(center)
  }
}

The gesture recognizer’s target-action is fired when there’s a state change in the gesture recognizer. When the fingers touch the screen, touchesBegan(_:withEvent:) fires. The gesture recognizer sets its state to .Began, resulting in an automatic call to the target-action. When the fingers are removed, touchesEnded(_:withEvent:) sets the state to .Ended, calling the target-action again.

Earlier, when you set up the gesture recognizer, you made the target-action the circled(_:) method. The implementation of this method uses the provided findCircledView(_:) to check which image was tapped.

Build and run your app; tap one of the images to select it. The game checks your response and moves you to the next round:

Tap an image to choose it


Handling Multiple Touches

So you have a working tap gesture recognizer, right? Not so fast, fancy fingers! :] Note that the methods are named with “touches” — plural. Gesture recognizers can detect multi-finger gestures, but the game’s circle recognizer is meant to recognize just a single-finger gesture.


You’ll need a check that there’s only a single finger involved in the touch.

Open CircleGestureRecognizer.swift and modify touchesBegan(_:withEvent:) to fail the gesture when more than one finger touches the screen; the touches set contains one UITouch per finger:

override func touchesBegan(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesBegan(touches, withEvent: event)
  if touches.count != 1 {
    state = .Failed
    return
  }
  state = .Began
}

Here you’ve introduced a third state: .Failed. .Ended indicates that the gesture completed successfully, while the .Failed state indicates that the user’s gesture wasn’t what you expected.

It’s important that you quickly move the state machine to a terminal state, such as .Failed, so that other gesture recognizers waiting in the wings get a chance to interpret the touches instead.

The state machine with the .Failed state added

Build and run your app again; try some multi-finger taps and some single-finger taps. This time, only a single-finger tap should work to select the image.

Detecting a Circle

“But, hold on a second,” you cry. “A tap does not a circle make!”

Well, if you wanna get all technical about it, a single point is a circle with a radius of 0. But that’s not what’s intended here; the user has to actually circle the image for the selection to count.

To find the circle, you’ll have to collect the points that the user moves his or her finger over and see if they form a circle.

This sounds like a perfect job for a collection.

Add the following instance variable to the top of the CircleGestureRecognizer class:

private var touchedPoints = [CGPoint]() // point history

You’ll use this to track the points the user touched.

Now add the following method to the CircleGestureRecognizer class:

override func touchesMoved(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesMoved(touches, withEvent: event)
 
  // 1
  if state == .Failed {
    return
  }
 
  // 2
  let window = view?.window
  if let touches = touches as? Set<UITouch>, loc = touches.first?.locationInView(window) {
    // 3
    touchedPoints.append(loc)
    // 4
    state = .Changed
  }
}

touchesMoved(_:withEvent:) fires whenever the user moves a finger after the initial touch event. Taking each numbered section in turn:

  1. Apple recommends you first check that the gesture hasn’t already failed; if it has, don’t continue to process the other touches. Touch events are buffered and processed serially in the event queue. If the user moves the touch fast enough, there could be touches still pending and processed after the gesture has already failed.
  2. To make the math easy, convert the tracked points to window coordinates. This makes it easier to track touches that don’t line up within any particular view, so the user can make a circle outside the bounds of the image, and have it still count towards selecting that image.
  3. Add the points to the array.
  4. Update the state to .Changed. This has the side effect of calling the target action as well.

.Changed is the next state to add to your state machine. The gesture recognizer should transition to .Changed every time the touches change; that is, whenever the finger is moved, added, or removed.

Here’s your new state machine with the .Changed state added:

state machine with .Changed added

Now that you have all the points, how are you going to figure out if the points form a circle?


Checking the Points

To start, add the following variables to the top of the class in CircleGestureRecognizer.swift:

var fitResult = CircleResult() // information about how circle-like is the path
var tolerance: CGFloat = 0.2 // circle wiggle room
var isCircle = false

These will help you determine if the points are within tolerance for a circle.

Update touchesEnded(_:withEvent:) so that it looks like the code below:

override func touchesEnded(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesEnded(touches, withEvent: event)
 
  // now that the user has stopped touching, figure out if the path was a circle
  fitResult = fitCircle(touchedPoints)
 
  isCircle = fitResult.error <= tolerance
  state = isCircle ? .Ended : .Failed
}

This cheats a little bit as it uses a pre-made circle detector. You can take a peek at CircleFit.swift now, but I’ll describe its inner workings in just a bit. The main take-away is that the detector tries to fit the traced points to a circle. The error value is how far the path deviated from a true circle, and the tolerance is there because you can’t expect users to draw a perfect circle. If the error is within tolerance, the recognizer moves to the .Ended state; if the circle is out of tolerance then move to .Failed.

If you were to build and run right now, the game wouldn’t quite work because the gesture recognizer is still treating the gesture like a tap.

Go back to GameViewController.swift, and change circled(_:) as follows:

func circled(c: CircleGestureRecognizer) {
  if c.state == .Ended {
    findCircledView(c.fitResult.center)
  }
}

This uses the calculated center of the circle to figure out which view was circled, instead of just getting the last point touched.

Build and run your app; try your hand at the game — pun quite intended. It’s not easy to get the app to recognize your circle, is it? What’s remaining is to bridge the difference between mathematical theory and the real world of imprecise circles.


Drawing As You Go

Since it’s tough to tell exactly what’s going on, you’ll draw the path the user traces with their finger. iOS already comes with most of what you need in Core Graphics.

Add the following to the instance variable declarations in CircleGestureRecognizer.swift:

var path = CGPathCreateMutable() // running CGPath - helps with drawing

This provides a mutable CGPath object for drawing the path.

Add the following to the bottom of touchesBegan(_:withEvent:):

let window = view?.window
if let touches = touches as? Set<UITouch>, loc = touches.first?.locationInView(window) {
  CGPathMoveToPoint(path, nil, loc.x, loc.y) // start the path
}

This makes sure the path starts out in the same place that the touches do.

Now add the following to touchesMoved(_:withEvent:), just below touchedPoints.append(loc) in the if let block at the bottom:

CGPathAddLineToPoint(path, nil, loc.x, loc.y)

Whenever the touch moves, you add the new point to the path by way of a line. Don’t worry about the straight line part; since the points should be very close together, this will wind up looking quite smooth once you draw the path.

In order to see the path, it has to be drawn in the game’s view. There’s already a view in the hierarchy of CircleDrawView.

To show the path in this view, add the following to the bottom of circled(_:) in GameViewController.swift:

if c.state == .Began {
  circlerDrawer.clear()
}
if c.state == .Changed {
  circlerDrawer.updatePath(c.path)
}

This clears the view when a gesture starts, and draws the path as a yellow line that follows the user’s finger.

Build and run your app; try drawing on the screen to see how it works:

The path is drawn as you trace your finger

Cool! But did you notice anything funny when you drew a second or third circle?


Even though you added a call to circlerDrawer.clear() when moving into the .Began state, it appears that each time a gesture is made, the previous ones are not cleared. That can only mean one thing: it’s time for a new action in your gesture recognizer state machine: reset().

The Reset Action

reset() is called after touchesEnded(_:withEvent:) and before the next touchesBegan(_:withEvent:). This gives the gesture recognizer a chance to clean up its state and start fresh.

Add the following method to CircleGestureRecognizer.swift:

override func reset() {
  super.reset()
  touchedPoints.removeAll(keepCapacity: true)
  path = CGPathCreateMutable()
  isCircle = false
  state = .Possible
}

Here you clear the collection of touch points and set path to a new path. You also reset the state to .Possible, the state a recognizer is in before it has either matched or failed its gesture.

Your new state machine looks like the following:

State machine with .Possible

Build and run your app again; this time, the view (and the gesture recognizer state) will be cleared between each touch.

The Math

What’s going on inside CircleFit, and why does it sometimes recognize weird shapes like lines, C’s, or S’s as circles?

Just a little line is recognized as a circle


Remember from high school that the equation for a circle is  x^2 + y^2 = r^2 . If the user traced a circle, then all the points touched will fit this equation exactly:


Or more precisely, since the recognizer wants to figure out any circle, and not just one centered on the origin, the equation is  (x - x_c)^2 + (y - y_c)^2 = r^2 . When the gesture ends, all you have is the collection of points, which is just the x’s and y’s. What’s left to figure out is the center (x_c, y_c) and the radius (r):

Circle centered at xc, yc


There are a few methods for figuring this out, but this tutorial uses a method adapted from Nikolai Chernov’s C++ implementation of a Taubin fit. It works as follows:

  1. First, you average all the points together to guess at the centroid of a circle (the mean of all the x and y coordinates). If it’s a true circle, then the centroid of all the points will be the center of the circle. If the points aren’t a true circle, then the calculated center will be somewhat off:

    The center of the circle is guessed at the start to be the mean of all the points.


  2. Next you calculate the moment. Imagine there is a mass at the center of the circle. The moment is a measure of how much each point in the touched path pulls at that mass.
  3. You then plug the moment value into a characteristic polynomial, the roots of which are used to find the “true center”. The moment is also used to calculate the radius. The mathematical theory is beyond the scope of the tutorial, but the main idea is that this is a mathematical way to solve  (x - x_c)^2 + (y - y_c)^2 = r^2 , where x_c, y_c, and r should be the same values for all the points.
  4. Finally, you calculate a root-mean-square error as the fit. This is a measure of how much the actual points deviate from a circle:

    The blue bars represent the error, or difference between the points and red circle fit.


And they say math is hard! Pshaw!

Does your brain hurt yet? The TLDR is that the algorithm tries to fit a circle at the center of all the points, and each point pulls out the radius according to how far it is from the computed center. Then you calculate the error value according to how far removed each point is from the calculated circle. If that error is small, then you assume you have a circle.
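That TLDR can be captured in a few lines. The sketch below (in JavaScript for brevity) is a deliberately naive fit, not the Taubin fit in CircleFit.swift: it guesses the center as the centroid, averages the distances for the radius, and reports the RMS deviation as the error:

```javascript
// Naive illustration of circle fitting -- not the Taubin fit the
// tutorial's CircleFit.swift actually uses.
function naiveCircleFit(points) {
  const n = points.length;
  const cx = points.reduce((sum, p) => sum + p.x, 0) / n;
  const cy = points.reduce((sum, p) => sum + p.y, 0) / n;
  // Distance of each point from the guessed center...
  const dists = points.map(p => Math.hypot(p.x - cx, p.y - cy));
  // ...averaged to estimate the radius...
  const radius = dists.reduce((sum, d) => sum + d, 0) / n;
  // ...and the RMS deviation from that radius is the fit error.
  const error = Math.sqrt(
    dists.reduce((sum, d) => sum + (d - radius) ** 2, 0) / n
  );
  return { center: { x: cx, y: cy }, radius, error };
}

// Twelve points sampled from a unit circle centered at (5, 5):
const circlePoints = Array.from({ length: 12 }, (_, i) => {
  const angle = (i / 12) * 2 * Math.PI;
  return { x: 5 + Math.cos(angle), y: 5 + Math.sin(angle) };
});
naiveCircleFit(circlePoints); // → radius ≈ 1, error ≈ 0
```

A true circle yields a near-zero error; the S-shaped and arc-shaped paths discussed next can still fool a fit like this, which is why the tutorial adds extra checks.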

The algorithm trips up when the points form symmetrical round shapes, such as C’s and S’s, where the calculated error is small, or when they form short arcs or lines, whose points are assumed to be a small arc on a much, much larger circle.

Most of the points are on a circle, and the other points are symmetric enough to "cancel each other out."



Here the line fits to a circle, since the points look like an arc.


Debugging the Draw

So to figure out what’s going on with the weird gestures, you can draw the fit circle on the screen.

In CircleDrawView.swift, set the value of drawDebug to true:

var drawDebug = true // set to true show additional information about the fit

This draws some additional info about the fit circle to the screen.

Update the view with the fit details by adding the following clause to circled(_:) in GameViewController.swift:

if c.state == .Ended || c.state == .Failed || c.state == .Cancelled {
  circlerDrawer.updateFit(c.fitResult, madeCircle: c.isCircle)
}

Build and run your app again; draw a circular path and when you lift your finger, the fit circle will be drawn on the screen: green if the fit was successful, red if the fit failed.


You’ll learn what the other squares do in just a bit.

Recognizing A Gesture, Not a Path

Going back to the misidentified shapes, how should these not-really-circular gestures be handled? The fit is obviously wrong in two cases: when the shape has points in the middle of the circle, and when the shape isn’t a complete circle.

Checking Inside

With false-positive shapes like S’s, swirls and figure-8’s, the fit has a low error, but is obviously not a circle. This is the difference between a mathematical approximation and a usable gesture. One obvious fix is to exclude any paths with points in the middle of the circle.

You can solve this by checking the touched points to see if any are inside the fit circle.

Add the following helper method to CircleGestureRecognizer.swift:

private func anyPointsInTheMiddle() -> Bool {
  // 1
  let fitInnerRadius = fitResult.radius / sqrt(2) * tolerance
  // 2
  let innerBox = CGRect(
    x: fitResult.center.x - fitInnerRadius,
    y: fitResult.center.y - fitInnerRadius,
    width: 2 * fitInnerRadius,
    height: 2 * fitInnerRadius)
 
  // 3
  var hasInside = false
  for point in touchedPoints {
    if innerBox.contains(point) {
      hasInside = true
      break
    }
  }
 
  return hasInside
}

This checks an exclusion zone: a smaller square that fits inside the circle. If there are any points inside this square, the gesture fails. The above code does the following:

  1. Calculates a smaller exclusion zone. The tolerance variable will provide enough space for a reasonable, but messy circle, but still have enough room to exclude any obviously non-circle shapes with points in the middle.
  2. To simplify the amount of code required, this constructs a smaller square centered on the circle.
  3. This loops over the points and checks if the point is contained within innerBox.
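In plain numbers, the test reduces to comparing each point against a square of half-width radius / √2 × tolerance centered on the fit. A quick sketch (JavaScript for brevity; the values mirror the Swift above, though the Swift version uses CGRect.contains rather than this coordinate comparison):

```javascript
// Sketch of the exclusion-zone check: any touched point inside a small
// square centered on the fit circle disqualifies the gesture.
function anyPointsInTheMiddle(fit, tolerance, touchedPoints) {
  const innerRadius = (fit.radius / Math.SQRT2) * tolerance;
  return touchedPoints.some(
    p =>
      Math.abs(p.x - fit.center.x) < innerRadius &&
      Math.abs(p.y - fit.center.y) < innerRadius
  );
}

const fit = { center: { x: 0, y: 0 }, radius: 100 };
anyPointsInTheMiddle(fit, 0.2, [{ x: 90, y: 0 }]); // → false: on the rim
anyPointsInTheMiddle(fit, 0.2, [{ x: 1, y: 1 }]);  // → true: dead center
```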

Next, modify touchesEnded(_:withEvent:) to add the following check to the isCircle criteria:

override func touchesEnded(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesEnded(touches, withEvent: event)
 
  // now that the user has stopped touching, figure out if the path was a circle
  fitResult = fitCircle(touchedPoints)
 
  // make sure there are no points in the middle of the circle
  let hasInside = anyPointsInTheMiddle()
 
  isCircle = fitResult.error <= tolerance && !hasInside
 
  state = isCircle ? .Ended : .Failed
}

This uses the check to see if there are any points in the middle of the circle. If so, then the circle is not detected.

Build and run; try drawing an ‘S’ shape. You should find it’s no longer recognized as a circle. Great! :]

Handling Small Arcs

Now that you’ve handled the round, non-circular shapes, what about those pesky short arcs that look like they’re part of a huge circle? If you look at the debug drawing, the size discrepancy between the path (black box) and the fit circle is huge:


Paths that you want to recognize as a circle should at least approximate the size of the circle itself:

matching_circle

Fixing this should be as easy as comparing the size of the path against the size of the fit circle.

Add the following helper method to CircleGestureRecognizer.swift:

private func calculateBoundingOverlap() -> CGFloat {
  // 1
  let fitBoundingBox = CGRect(
    x: fitResult.center.x - fitResult.radius,
    y: fitResult.center.y - fitResult.radius,
    width: 2 * fitResult.radius,
    height: 2 * fitResult.radius)
  let pathBoundingBox = CGPathGetBoundingBox(path)
 
  // 2
  let overlapRect = fitBoundingBox.rectByIntersecting(pathBoundingBox)
 
  // 3
  let overlapRectArea = overlapRect.width * overlapRect.height
  let circleBoxArea = fitBoundingBox.height * fitBoundingBox.width
 
  let percentOverlap = overlapRectArea / circleBoxArea
  return percentOverlap
}

This calculates how much the user’s path overlaps the fit circle:

  1. Find the bounding box of the circle fit and the user’s path. This uses CGPathGetBoundingBox to handle the tricky math, since the touch points were also captured as part of the CGMutablePath path variable.
  2. Calculate the rectangle where the two paths overlap, using the rectByIntersecting method on CGRect.
  3. Figure out how much the two bounding boxes overlap, as a percentage of area. This percentage will be in the 80%-100% range for a good circle gesture. In the case of the short arc shape, it will be very, very tiny!
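To see why the overlap ratio catches short arcs, here's the arithmetic with purely hypothetical numbers: suppose the fit circle's bounding box is 200x200 points, but the short arc's path only overlaps a 180x40 strip of it.

```swift
// Hypothetical bounding-box sizes for a short arc (numbers are illustrative only).
let circleBoxArea  = 200.0 * 200.0   // area of the fit circle's bounding box
let overlapArea    = 180.0 * 40.0    // area of its intersection with the path's box
let percentOverlap = overlapArea / circleBoxArea

// 7200 / 40000 = 0.18 -- far short of the roughly 0.8 a real circle produces,
// so a short arc like this is rejected.
```

A genuine circle traces out most of its own bounding box, so its ratio lands near 1.0; the arc's tiny strip gives it away immediately.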

Next, modify the isCircle check in touchesEnded(_:withEvent:) as follows:

let percentOverlap = calculateBoundingOverlap()
isCircle = fitResult.error <= tolerance && !hasInside && percentOverlap > (1-tolerance)
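Putting the three criteria together, here's how the full decision plays out with assumed, illustrative values (the actual tolerance in your project may differ):

```swift
// Sketch of the combined circle test with assumed values, for illustration only.
let tolerance: Double = 0.2   // assumed tolerance value
let fitError = 0.15           // error of the circle fit
let hasInside = false         // no points in the exclusion zone
let percentOverlap = 0.9      // path covers 90% of the fit circle's box

let isCircle = fitError <= tolerance && !hasInside && percentOverlap > (1 - tolerance)
// All three criteria pass, so isCircle is true.
```

Change any one of the three inputs (a sloppy fit, a stroke through the middle, or a short arc) and the conjunction fails, which is exactly the behavior you want.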

Build and run your app again; only reasonable circles should pass the test. Do your worst to fool it! :]

detect_small_arc_as_bad

awhyeah

Handling the Cancelled State

Did you notice the check for .Cancelled above in the debug drawing section? Touches are cancelled when a system alert comes up, or the gesture recognizer is explicitly cancelled through a delegate or by disabling it mid-touch. There’s not much to be done for the circle recognizer other than to update the state machine. Add the following method to CircleGestureRecognizer.swift:

override func touchesCancelled(touches: Set<NSObject>!, withEvent event: UIEvent!) {
  super.touchesCancelled(touches, withEvent: event)
  state = .Cancelled // forward the cancel state
}

This simply sets the state to .Cancelled when the touches are cancelled.

Handling Other Touches

With the game running, tap the New Set button. Notice anything? That’s right, the button doesn’t work! That’s because the gesture recognizer is sucking up all the taps!

no_button_worky

There are a few ways to get the gesture recognizer to interact properly with the other controls. The primary way is to override the default behavior by using a UIGestureRecognizerDelegate.

Open GameViewController.swift. In viewDidLoad(), set the delegate of the gesture recognizer to self:

circleRecognizer.delegate = self

Now add the following extension at the bottom of the file, to implement the delegate method:

extension GameViewController: UIGestureRecognizerDelegate {
  func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    // allow button press
    return !(touch.view is UIButton)
  }
}

This prevents the gesture recognizer from recognizing touches over a button; this lets the touch proceed down to the button itself. There are several delegate methods, and these can be used to customize where and how a gesture recognizer works in the view hierarchy.
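As a hypothetical extension, if the game later gained other controls besides the button, you could broaden the same delegate method to ignore touches over any control. This is a sketch, not part of the tutorial's project:

```swift
extension GameViewController: UIGestureRecognizerDelegate {
  func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    // UIButton, UISlider, UISwitch and friends all inherit from UIControl,
    // so this single check covers every standard control.
    return !(touch.view is UIControl)
  }
}
```

Since UIButton is itself a UIControl subclass, this version keeps the New Set button working while also excluding any controls you add later.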

Build and run your app again; tap the button and it should work properly now.

Spit-Shining the Game

All that’s left is to clean up the interaction and make this a well-polished game.

First, you need to prevent the user from interacting with the view after an image has been circled. Otherwise, the path will continue to update while waiting for the new set of images.

Open GameViewController.swift. Add the following code to the bottom of selectImageViewAtIndex(_:):

circleRecognizer.enabled = false

Now re-enable your gesture recognizer at the bottom of startNewSet(_:), so the next round can proceed:

circleRecognizer.enabled = true

Next, add the following to the .Began clause in circled(_:):

if c.state == .Began {
  circlerDrawer.clear()
  goToNextTimer?.invalidate()
}

This clears any previously drawn path and cancels the pending timer as soon as a new gesture begins.

Also in circled(_:), add the following code to the final state check:

if c.state == .Ended || c.state == .Failed || c.state == .Cancelled {
  circlerDrawer.updateFit(c.fitResult, madeCircle: c.isCircle)
  goToNextTimer = NSTimer.scheduledTimerWithTimeInterval(afterGuessTimeout, target: self, selector: "timerFired:", userInfo: nil, repeats: false)
}

This sets up a timer to fire a short time after the gesture recognizer either ends, fails, or is cancelled.

Finally, add the following method to GameViewController:

func timerFired(timer: NSTimer) {
  circlerDrawer.clear()
}

This clears the circle once the timer fires, so the user is tempted to draw another circle and have another go.

Build and run your app; if the gesture doesn’t approximate a circle, you’ll see that the path clears automatically after a short delay.

Where to Go From Here?

You can download the completed project from this tutorial here.

You’ve built a simple, yet powerful circle gesture recognizer for your game. You can extend these concepts further to recognize other drawn shapes, or even customize the circle fit algorithm to fit other needs.

For more details, check out Apple’s documentation on Gesture Recognizers.

If you have any questions or comments about this tutorial, feel free to join the forum discussion below!

The post UIGestureRecognizer Tutorial: Creating Custom Recognizers appeared first on Ray Wenderlich.
