In this video, you'll learn how to use Charles Proxy to investigate third party apps and even debug your own.
The post Screencast: Charles Proxy: Troubleshooting Your App appeared first on Ray Wenderlich.
In this video, you’ll learn about Model-View-ViewModel (MVVM) which you’ll use to further combat massive view controllers.
The post Video Tutorial: iOS Design Patterns Part 4: MVVM appeared first on Ray Wenderlich.
The Swift Algorithm Club is a popular open source project that implements popular algorithms and data structures in Swift, with over 13,000 stars on GitHub.
We periodically give status updates on how things are going with the project. There have been quite a few updates to the repo this month, including three brand new contributions:
Let’s dig in!
In the traditional coin change problem, you have to find all the different ways to change some money given a set of coin denominations (e.g. 1 cent, 2 cents, 5 cents, 10 cents, etc.).
For example, if you have the value of 4 Euro cents, that can be changed in three possible ways:
The minimum coin change problem is a variation of the generic coin change problem, where you need to find the option that returns the least number of coins.
You can learn how to implement this in Swift here:
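The repo has the full write-up; as a taste, here is a minimal dynamic-programming sketch of the minimum coin change problem (the function name and signature are illustrative, not the club's actual API):
func minimumCoinCount(amount: Int, coins: [Int]) -> Int? {
    // best[v] holds the fewest coins needed to make value v, or Int.max if unreachable.
    guard amount > 0 else { return 0 }
    var best = Array(repeating: Int.max, count: amount + 1)
    best[0] = 0
    for value in 1...amount {
        for coin in coins where coin <= value && best[value - coin] != Int.max {
            best[value] = min(best[value], best[value - coin] + 1)
        }
    }
    return best[amount] == Int.max ? nil : best[amount]
}
// For example, minimumCoinCount(amount: 4, coins: [1, 2, 5]) returns 2 (two 2-cent coins).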
This algorithm was invented in 1956 by Edsger W. Dijkstra, and is one of the most effective algorithms for finding the shortest paths from one point to all other points in a graph.
The classic example is a road network. If you want to find the shortest path between two points, say, from your house to your workplace, you would use Dijkstra's algorithm.
You can learn how to implement this in Swift here:
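The club's implementation has the details; the core idea fits in a short sketch like the one below, which uses a plain array scan instead of a priority queue (the adjacency-list representation is assumed for illustration):
// Returns the shortest distance from `source` to every vertex; unreachable
// vertices keep a distance of infinity.
func dijkstra(adjacency: [[(node: Int, weight: Double)]], source: Int) -> [Double] {
    var distance = Array(repeating: Double.infinity, count: adjacency.count)
    var visited = Array(repeating: false, count: adjacency.count)
    distance[source] = 0
    for _ in 0..<adjacency.count {
        // Pick the unvisited vertex with the smallest tentative distance.
        var current = -1
        for v in 0..<adjacency.count where !visited[v] {
            if current == -1 || distance[v] < distance[current] { current = v }
        }
        guard current != -1, distance[current] < .infinity else { break }
        visited[current] = true
        // Relax every edge leaving the current vertex.
        for (neighbor, weight) in adjacency[current] where distance[current] + weight < distance[neighbor] {
            distance[neighbor] = distance[current] + weight
        }
    }
    return distance
}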
A minimum spanning tree (MST) of a connected, undirected, weighted graph is a subset of the edges from the original graph that connects all the vertices together, without any cycles and with the minimum possible total edge weight. A graph can have more than one MST.
There are two popular algorithms for calculating the MST of a graph: Kruskal's algorithm and Prim's algorithm. Both have a total time complexity of O(E log E), where E is the number of edges in the original graph.
You can learn how to implement this in Swift here:
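The article covers both algorithms; to give a flavor of the approach, here is a compact sketch of Kruskal's algorithm with a simple union-find (types and names are illustrative, not the club's exact code):
struct Edge {
    let u: Int
    let v: Int
    let weight: Double
}
struct UnionFind {
    private var parent: [Int]
    init(count: Int) { parent = Array(0..<count) }
    mutating func find(_ x: Int) -> Int {
        if parent[x] != x { parent[x] = find(parent[x]) }   // path compression
        return parent[x]
    }
    // Returns false if x and y were already connected (adding the edge would form a cycle).
    mutating func union(_ x: Int, _ y: Int) -> Bool {
        let rootX = find(x)
        let rootY = find(y)
        guard rootX != rootY else { return false }
        parent[rootX] = rootY
        return true
    }
}
func minimumSpanningTree(vertexCount: Int, edges: [Edge]) -> [Edge] {
    var sets = UnionFind(count: vertexCount)
    var mst: [Edge] = []
    // Sorting dominates the cost, giving the O(E log E) bound mentioned above.
    for edge in edges.sorted(by: { $0.weight < $1.weight }) {
        if sets.union(edge.u, edge.v) {
            mst.append(edge)
        }
    }
    return mst
}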
The Swift Algorithm Club is always looking for new members. Whether you’re here to learn or here to contribute, we’re happy to have you around.
To learn more about the Swift Algorithm Club, check out our introductory article. We hope to see you at the club! :]
The post Swift Algorithm Club: June Digest 2017 appeared first on Ray Wenderlich.
Learn about the multicast closure delegate pattern, a spin-off pattern from delegate, which will prepare you for auto re-login authentication (in the next video).
The post Video Tutorial: iOS Design Patterns Part 5: Multicast Closure Delegate appeared first on Ray Wenderlich.
The motivational run-tracking app Runkeeper has over 40 million users! This tutorial will show you how to make an app like Runkeeper that will teach you the following:
The result? Your new app — MoonRunner — with badges based on planets and moons in our Solar System!
Before you run headlong into this tutorial, you should be familiar with Storyboards and Core Data. Check out the linked tutorials if you feel you need a refresher.
This How to Make an App Like Runkeeper tutorial also makes use of iOS 10’s new Measurement and MeasurementFormatter capabilities. See the linked screencasts if you need more detail.
There’s so much to talk about that this tutorial comes in two parts. The first segment focuses on recording the run data and rendering the color-coded map. The second segment introduces the badge system.
Download the starter project. It includes all of the project files and assets that you will need to complete this tutorial.
Take a few minutes to explore the project. Main.storyboard already contains the UI. CoreDataStack.swift removes Apple's template Core Data code from AppDelegate and puts it in its own class. Assets.xcassets contains the images and sounds you will use.
MoonRunner's use of Core Data is fairly simple, using only two entities: Run and Location.
Open MoonRunner.xcdatamodeld and create two entities: Run and Location. Configure Run with the following properties:
A Run has three attributes: distance, duration and timestamp. It has a single relationship, locations, that connects it to the Location entity.
Now, set up Location with the following properties:
A Location also has three attributes: latitude, longitude and timestamp, and a single relationship, run.
Select the Run entity and verify that its locations relationship's Inverse property now says "run".
Select the locations relationship, set the Type to To Many, and check the Ordered box in the Data Model Inspector's Relationship pane.
Finally, verify that both the Run and Location entities' Codegen property is set to Class Definition in the Entity pane of the Data Model Inspector (this is the default).
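With Codegen set to Class Definition, Xcode generates NSManagedObject subclasses for you behind the scenes. You don't write this code yourself, but it is roughly equivalent to the following sketch (exact optionality and types can vary slightly between Xcode versions):
import CoreData
// Sketch of the generated classes for this model; you never add these by hand.
public class Run: NSManagedObject {
    @NSManaged public var distance: Double
    @NSManaged public var duration: Int16
    @NSManaged public var timestamp: Date?
    @NSManaged public var locations: NSOrderedSet?
    // Xcode also generates ordered-set accessors such as addToLocations(_:),
    // which you will use later when saving a run.
}
public class Location: NSManagedObject {
    @NSManaged public var latitude: Double
    @NSManaged public var longitude: Double
    @NSManaged public var timestamp: Date?
    @NSManaged public var run: Run?
}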
Build your project so that Xcode can generate the necessary Swift definitions for your Core Data model.
Open RunDetailsViewController.swift and add the following line right before viewDidLoad():
var run: Run!
Next, add the following function after viewDidLoad():
private func configureView() {
}
Finally, inside viewDidLoad(), after the call to super.viewDidLoad(), add a call to configureView():
configureView()
This sets up the bare minimum necessary to complete navigation in the app.
Open NewRunViewController.swift and add the following line right before viewDidLoad():
private var run: Run?
Next, add the following new methods:
private func startRun() {
launchPromptStackView.isHidden = true
dataStackView.isHidden = false
startButton.isHidden = true
stopButton.isHidden = false
}
private func stopRun() {
launchPromptStackView.isHidden = false
dataStackView.isHidden = true
startButton.isHidden = false
stopButton.isHidden = true
}
The stop button and the UIStackView containing the labels that describe the run are hidden in the storyboard. These routines switch the UI between its "not running" and "during run" modes.
In startTapped(), add a call to startRun():
startRun()
At the end of the file, after the closing brace, add the following extension:
extension NewRunViewController: SegueHandlerType {
enum SegueIdentifier: String {
case details = "RunDetailsViewController"
}
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
switch segueIdentifier(for: segue) {
case .details:
let destination = segue.destination as! RunDetailsViewController
destination.run = run
}
}
}
Apple’s interface for storyboard segues is what is commonly known as “stringly typed”. The segue identifier is a string, and there is no error checking. Using the power of Swift protocols and enums, and a little bit of pixie dust in StoryboardSupport.swift, you can avoid much of the pain of such a “stringly typed” interface.
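The contents of StoryboardSupport.swift aren't reproduced in this tutorial, but a SegueHandlerType helper of this kind generally looks something like the following sketch (inferred from the calls used above; the file in the starter project may differ in detail):
import UIKit
protocol SegueHandlerType {
    associatedtype SegueIdentifier: RawRepresentable
}
extension SegueHandlerType where Self: UIViewController, SegueIdentifier.RawValue == String {
    // Lets you perform a segue with an enum case instead of a raw string.
    func performSegue(withIdentifier identifier: SegueIdentifier, sender: Any?) {
        performSegue(withIdentifier: identifier.rawValue, sender: sender)
    }
    // Converts a segue's string identifier back into the type-safe enum.
    func segueIdentifier(for segue: UIStoryboardSegue) -> SegueIdentifier {
        guard let identifier = segue.identifier,
            let segueIdentifier = SegueIdentifier(rawValue: identifier) else {
                fatalError("Unknown segue: \(segue)")
        }
        return segueIdentifier
    }
}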
Next, add the following lines to stopTapped():
let alertController = UIAlertController(title: "End run?",
message: "Do you wish to end your run?",
preferredStyle: .actionSheet)
alertController.addAction(UIAlertAction(title: "Cancel", style: .cancel))
alertController.addAction(UIAlertAction(title: "Save", style: .default) { _ in
self.stopRun()
self.performSegue(withIdentifier: .details, sender: nil)
})
alertController.addAction(UIAlertAction(title: "Discard", style: .destructive) { _ in
self.stopRun()
_ = self.navigationController?.popToRootViewController(animated: true)
})
present(alertController, animated: true)
When the user presses the stop button, you should let them decide whether to save, discard, or continue the run. You use a UIAlertController to prompt the user and get their response.
Build and run. Press the New Run button and then the Start button. Verify that the UI changes to the “running mode”:
Press the Stop button and verify that pressing Save takes you to the “Details” screen.
You may see a message like this in the Xcode console:
MoonRunner[5400:226999] [VKDefault] /BuildRoot/Library/Caches/com.apple.xbs/Sources/VectorKit_Sim/VectorKit-1295.30.5.4.13/src/MDFlyoverAvailability.mm:66: Missing latitude in trigger specification
This is normal and does not indicate an error on your part.
iOS 10 introduced new capabilities that make it far easier to work with and display units of measurement. Runners tend to think of their progress in terms of pace (time per unit distance), which is the inverse of speed (distance per unit time). You must extend UnitSpeed to support the concept of pace.
Add a new Swift file to your project named UnitExtensions.swift. Add the following after the import statement:
class UnitConverterPace: UnitConverter {
private let coefficient: Double
init(coefficient: Double) {
self.coefficient = coefficient
}
override func baseUnitValue(fromValue value: Double) -> Double {
return reciprocal(value * coefficient)
}
override func value(fromBaseUnitValue baseUnitValue: Double) -> Double {
return reciprocal(baseUnitValue * coefficient)
}
private func reciprocal(_ value: Double) -> Double {
guard value != 0 else { return 0 }
return 1.0 / value
}
}
Before you can extend UnitSpeed to convert to and from a pace measurement, you must create a UnitConverter that can handle the math. Subclassing UnitConverter requires that you implement baseUnitValue(fromValue:) and value(fromBaseUnitValue:).
Now, add this code to the end of the file:
extension UnitSpeed {
class var secondsPerMeter: UnitSpeed {
return UnitSpeed(symbol: "sec/m", converter: UnitConverterPace(coefficient: 1))
}
class var minutesPerKilometer: UnitSpeed {
return UnitSpeed(symbol: "min/km", converter: UnitConverterPace(coefficient: 60.0 / 1000.0))
}
class var minutesPerMile: UnitSpeed {
return UnitSpeed(symbol: "min/mi", converter: UnitConverterPace(coefficient: 60.0 / 1609.34))
}
}
UnitSpeed is one of the many types of Unit provided in Foundation. UnitSpeed's default unit is meters per second. Your extension allows the speed to be expressed in terms of minutes per kilometer or minutes per mile.
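As a quick sanity check of the new units, a speed can now be converted into a pace like this (the values are illustrative):
import Foundation
let speed = Measurement(value: 3.0, unit: UnitSpeed.metersPerSecond)
let pace = speed.converted(to: .minutesPerMile)
print(pace) // roughly 8.94 min/mi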
You need a uniform way to display quantities such as distance, time, pace and date throughout MoonRunner. MeasurementFormatter and DateFormatter make this simple.
Add a new Swift file to your project named FormatDisplay.swift. Add the following after the import statement:
struct FormatDisplay {
static func distance(_ distance: Double) -> String {
let distanceMeasurement = Measurement(value: distance, unit: UnitLength.meters)
return FormatDisplay.distance(distanceMeasurement)
}
static func distance(_ distance: Measurement<UnitLength>) -> String {
let formatter = MeasurementFormatter()
return formatter.string(from: distance)
}
static func time(_ seconds: Int) -> String {
let formatter = DateComponentsFormatter()
formatter.allowedUnits = [.hour, .minute, .second]
formatter.unitsStyle = .positional
formatter.zeroFormattingBehavior = .pad
return formatter.string(from: TimeInterval(seconds))!
}
static func pace(distance: Measurement<UnitLength>, seconds: Int, outputUnit: UnitSpeed) -> String {
let formatter = MeasurementFormatter()
formatter.unitOptions = [.providedUnit] // 1
let speedMagnitude = seconds != 0 ? distance.value / Double(seconds) : 0
let speed = Measurement(value: speedMagnitude, unit: UnitSpeed.metersPerSecond)
return formatter.string(from: speed.converted(to: outputUnit))
}
static func date(_ timestamp: Date?) -> String {
guard let timestamp = timestamp as Date? else { return "" }
let formatter = DateFormatter()
formatter.dateStyle = .medium
return formatter.string(from: timestamp)
}
}
These simple functions should be mostly self-explanatory. In pace(distance:seconds:outputUnit:), you must set the MeasurementFormatter's unitOptions to .providedUnit to prevent it from displaying the localized measurement for speed (e.g. mph or km/h).
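Here is how these helpers might be used (illustrative values; the exact strings depend on the user's locale):
let tenK = Measurement(value: 10_000, unit: UnitLength.meters)
let distanceText = FormatDisplay.distance(tenK)      // e.g. "10 km" or "6.214 mi"
let timeText = FormatDisplay.time(3_661)             // "1:01:01"
let paceText = FormatDisplay.pace(distance: tenK,
                                  seconds: 3_000,
                                  outputUnit: .minutesPerMile)  // about 8 min/mi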
It's almost time to start running. But first, the app needs to know where it is. For this, you will use Core Location. It is important that there be only one instance of CLLocationManager in your app and that it not be inadvertently deleted.
To accomplish this, add another Swift file to your project named LocationManager.swift. Replace the contents of the file with:
import CoreLocation
class LocationManager {
static let shared = CLLocationManager()
private init() { }
}
You need to make a couple of project level changes before you can begin tracking the user’s location.
First, click on the project at the top of the Project Navigator.
Select the Capabilities tab and switch Background Modes to ON. Check Location updates.
Next, open Info.plist. Click the + next to Information Property List. From the resulting drop-down list, select Privacy – Location When In Use Usage Description and set its value to MoonRunner needs access to your location in order to record and track your run!
Before your app can use location information, it must get permission from the user. Open AppDelegate.swift and add the following to application(_:didFinishLaunchingWithOptions:) just before return true:
let locationManager = LocationManager.shared
locationManager.requestWhenInUseAuthorization()
Open NewRunViewController.swift and import CoreLocation:
import CoreLocation
Next, add the following after the run property:
private let locationManager = LocationManager.shared
private var seconds = 0
private var timer: Timer?
private var distance = Measurement(value: 0, unit: UnitLength.meters)
private var locationList: [CLLocation] = []
Taking it line-by-line:
locationManager is the object you'll use to start and stop location services.
seconds tracks the duration of the run, in seconds.
timer will fire each second and update the UI accordingly.
distance holds the cumulative distance of the run.
locationList is an array to hold all the CLLocation objects collected during the run.
Add the following after viewDidLoad():
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
timer?.invalidate()
locationManager.stopUpdatingLocation()
}
This ensures that location updates, a big battery consumer, as well as the timer are stopped when the user navigates away from the view.
Add the following two methods:
func eachSecond() {
seconds += 1
updateDisplay()
}
private func updateDisplay() {
let formattedDistance = FormatDisplay.distance(distance)
let formattedTime = FormatDisplay.time(seconds)
let formattedPace = FormatDisplay.pace(distance: distance,
seconds: seconds,
outputUnit: UnitSpeed.minutesPerMile)
distanceLabel.text = "Distance: \(formattedDistance)"
timeLabel.text = "Time: \(formattedTime)"
paceLabel.text = "Pace: \(formattedPace)"
}
eachSecond() will be called once per second by a Timer that you will set up shortly. updateDisplay() uses the fancy formatting capabilities you built in FormatDisplay.swift to update the UI with the details of the current run.
Core Location reports location updates via its CLLocationManagerDelegate. Add this now in an extension at the end of the file:
extension NewRunViewController: CLLocationManagerDelegate {
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
for newLocation in locations {
let howRecent = newLocation.timestamp.timeIntervalSinceNow
guard newLocation.horizontalAccuracy < 20 && abs(howRecent) < 10 else { continue }
if let lastLocation = locationList.last {
let delta = newLocation.distance(from: lastLocation)
distance = distance + Measurement(value: delta, unit: UnitLength.meters)
}
locationList.append(newLocation)
}
}
}
This delegate method will be called each time Core Location updates the user's location, providing an array of CLLocation objects. Usually this array contains only one object but, if there are more, they are ordered by time with the most recent location last.
A CLLocation contains some great information, including the latitude, longitude, and timestamp of the reading.
Before blindly accepting the reading, it’s worth checking the accuracy of the data. If the device isn’t confident it has a reading within 20 meters of the user’s actual location, it’s best to keep it out of your dataset. It's also important to ensure that the data is recent.
If the CLLocation passes the checks, the distance between it and the most recently saved point is added to the cumulative distance of the run. distance(from:) is very convenient here, taking into account some surprisingly difficult math involving the Earth's curvature, and returning a distance in meters.
Lastly, the location object itself is added to a growing array of locations.
Now add the following method back in NewRunViewController (not the extension):
private func startLocationUpdates() {
locationManager.delegate = self
locationManager.activityType = .fitness
locationManager.distanceFilter = 10
locationManager.startUpdatingLocation()
}
You make this class the delegate for Core Location so that you can receive and process location updates.
The activityType parameter is made specifically for an app like this one. It helps the device intelligently save power throughout the user's run, such as when they stop to cross a road.
Lastly, you set a distanceFilter of 10 meters. Unlike the activityType, this parameter doesn't affect battery life: the activityType influences how readings are taken, while the distanceFilter controls how readings are reported.
As you'll see after doing a test run later, the location readings can deviate a little from a straight line. A higher distanceFilter could reduce the zigging and zagging and, thus, give you a more accurate line. Unfortunately, a filter that's too high will pixelate your readings. That's why 10 meters is a good balance.
Finally, you tell Core Location to start getting location updates!
To actually begin the run, add these lines to the end of startRun():
seconds = 0
distance = Measurement(value: 0, unit: UnitLength.meters)
locationList.removeAll()
updateDisplay()
timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
self.eachSecond()
}
startLocationUpdates()
This resets all of the values to be updated during the run to their initial state, starts the Timer that will fire each second, and begins collecting location updates.
At some point, your user will get tired and stop running. You have the UI in place to do that, but you also need to save the run's data or your user will be very unhappy to see all of that effort go to waste.
Add the following method to NewRunViewController in NewRunViewController.swift:
private func saveRun() {
let newRun = Run(context: CoreDataStack.context)
newRun.distance = distance.value
newRun.duration = Int16(seconds)
newRun.timestamp = Date()
for location in locationList {
let locationObject = Location(context: CoreDataStack.context)
locationObject.timestamp = location.timestamp
locationObject.latitude = location.coordinate.latitude
locationObject.longitude = location.coordinate.longitude
newRun.addToLocations(locationObject)
}
CoreDataStack.saveContext()
run = newRun
}
If you've used Core Data prior to Swift 3, you will notice the power and simplicity of iOS 10's Core Data support. You create a new Run object and initialize its values as with any other Swift object. You then create a Location object for each CLLocation you recorded, saving only the relevant data. Finally, you add each of these new Locations to the Run using the automatically generated addToLocations(_:).
When the user ends the run, you want to stop tracking location. Add the following to the end of stopRun():
locationManager.stopUpdatingLocation()
Finally, in stopTapped(), locate the UIAlertAction titled "Save" and add a call to self.saveRun() so that it looks like this:
alertController.addAction(UIAlertAction(title: "Save", style: .default) { _ in
self.stopRun()
self.saveRun() // ADD THIS LINE!
self.performSegue(withIdentifier: .details, sender: nil)
})
While you should always test your app on a real device before releasing it, you don't have to go for a run each time you want to test MoonRunner.
Build and run in the simulator. Before pressing the New Run button, select Debug\Location\City Run from the Simulator menu.
Now, press New Run, then press Start and verify that the simulator begins its workout.
After all of that hard work, it's time to show the user where they went and how well they did.
Open RunDetailsViewController.swift and replace configureView() with:
private func configureView() {
let distance = Measurement(value: run.distance, unit: UnitLength.meters)
let seconds = Int(run.duration)
let formattedDistance = FormatDisplay.distance(distance)
let formattedDate = FormatDisplay.date(run.timestamp)
let formattedTime = FormatDisplay.time(seconds)
let formattedPace = FormatDisplay.pace(distance: distance,
seconds: seconds,
outputUnit: UnitSpeed.minutesPerMile)
distanceLabel.text = "Distance: \(formattedDistance)"
dateLabel.text = formattedDate
timeLabel.text = "Time: \(formattedTime)"
paceLabel.text = "Pace: \(formattedPace)"
}
This formats all of the details of the run and sets them to display.
Rendering the run on the map requires a bit more work. There are three steps to this:
1. Create an MKCoordinateRegion that restricts the visible map to the area of the run.
2. Create an MKOverlay that describes the line to be drawn.
3. Provide an MKOverlayRenderer, via the map view's delegate, that draws the line.
Add the following method:
private func mapRegion() -> MKCoordinateRegion? {
guard
let locations = run.locations,
locations.count > 0
else {
return nil
}
let latitudes = locations.map { location -> Double in
let location = location as! Location
return location.latitude
}
let longitudes = locations.map { location -> Double in
let location = location as! Location
return location.longitude
}
let maxLat = latitudes.max()!
let minLat = latitudes.min()!
let maxLong = longitudes.max()!
let minLong = longitudes.min()!
let center = CLLocationCoordinate2D(latitude: (minLat + maxLat) / 2,
longitude: (minLong + maxLong) / 2)
let span = MKCoordinateSpan(latitudeDelta: (maxLat - minLat) * 1.3,
longitudeDelta: (maxLong - minLong) * 1.3)
return MKCoordinateRegion(center: center, span: span)
}
An MKCoordinateRegion represents the display region for the map. You define it by supplying a center point and a span that defines horizontal and vertical ranges. It's important to add a little padding so that the map edges don't crowd the route.
At the end of the file, after the closing brace, add the following extension:
extension RunDetailsViewController: MKMapViewDelegate {
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
guard let polyline = overlay as? MKPolyline else {
return MKOverlayRenderer(overlay: overlay)
}
let renderer = MKPolylineRenderer(polyline: polyline)
renderer.strokeColor = .black
renderer.lineWidth = 3
return renderer
}
}
Each time MapKit wants to display an overlay, it asks its delegate for something to render that overlay. For now, if the overlay is an MKPolyline (a collection of line segments), you return MapKit's MKPolylineRenderer configured to draw in black. You'll make this more colorful shortly.
Finally, you need to create your overlay. Add the following method to RunDetailsViewController (not the extension):
private func polyLine() -> MKPolyline {
guard let locations = run.locations else {
return MKPolyline()
}
let coords: [CLLocationCoordinate2D] = locations.map { location in
let location = location as! Location
return CLLocationCoordinate2D(latitude: location.latitude, longitude: location.longitude)
}
return MKPolyline(coordinates: coords, count: coords.count)
}
Here, you turn each recorded location from the run into a CLLocationCoordinate2D, as required by MKPolyline.
Now it's time to glue all these bits together. Add the following method:
private func loadMap() {
guard
let locations = run.locations,
locations.count > 0,
let region = mapRegion()
else {
let alert = UIAlertController(title: "Error",
message: "Sorry, this run has no locations saved",
preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .cancel))
present(alert, animated: true)
return
}
mapView.setRegion(region, animated: true)
mapView.add(polyLine())
}
Here, you make sure there is something to draw. Then you set the map region and add the overlay.
Now, add the following at the end of configureView():
loadMap()
Build and run. When you save your completed run, you should now see a map of the run!
You may see messages like these in the Xcode console:
ERROR /BuildRoot/Library/Caches/com.apple.xbs/Sources/VectorKit_Sim/VectorKit-1230.34.9.30.27/GeoGL/GeoGL/GLCoreContext.cpp 1763: InfoLog SolidRibbonShader:
ERROR /BuildRoot/Library/Caches/com.apple.xbs/Sources/VectorKit_Sim/VectorKit-1230.34.9.30.27/GeoGL/GeoGL/GLCoreContext.cpp 1764: WARNING: Output of vertex shader 'v_gradient' not read by fragment shader
/BuildRoot/Library/Caches/com.apple.xbs/Sources/VectorKit_Sim/VectorKit-1295.30.5.4.13/src/MDFlyoverAvailability.mm:66: Missing latitude in trigger specification
On the simulator, this is normal. The messages come from MapKit and do not indicate an error on your part.
The app is pretty awesome already, but the map could be much better if you used color to highlight differences in pace.
Add a new Cocoa Touch Class file, and name it MulticolorPolyline. Make it a subclass of MKPolyline.
Open MulticolorPolyline.swift and import MapKit:
import MapKit
Add a color property to the class:
var color = UIColor.black
Wow, that was easy! :] Now, for the more difficult stuff, open RunDetailsViewController.swift and add the following method:
private func segmentColor(speed: Double, midSpeed: Double, slowestSpeed: Double, fastestSpeed: Double) -> UIColor {
enum BaseColors {
static let r_red: CGFloat = 1
static let r_green: CGFloat = 20 / 255
static let r_blue: CGFloat = 44 / 255
static let y_red: CGFloat = 1
static let y_green: CGFloat = 215 / 255
static let y_blue: CGFloat = 0
static let g_red: CGFloat = 0
static let g_green: CGFloat = 146 / 255
static let g_blue: CGFloat = 78 / 255
}
let red, green, blue: CGFloat
if speed < midSpeed {
let ratio = CGFloat((speed - slowestSpeed) / (midSpeed - slowestSpeed))
red = BaseColors.r_red + ratio * (BaseColors.y_red - BaseColors.r_red)
green = BaseColors.r_green + ratio * (BaseColors.y_green - BaseColors.r_green)
blue = BaseColors.r_blue + ratio * (BaseColors.y_blue - BaseColors.r_blue)
} else {
let ratio = CGFloat((speed - midSpeed) / (fastestSpeed - midSpeed))
red = BaseColors.y_red + ratio * (BaseColors.g_red - BaseColors.y_red)
green = BaseColors.y_green + ratio * (BaseColors.g_green - BaseColors.y_green)
blue = BaseColors.y_blue + ratio * (BaseColors.g_blue - BaseColors.y_blue)
}
return UIColor(red: red, green: green, blue: blue, alpha: 1)
}
Here, you define the recipes for your base red, yellow and green colors. Then you create a blended color based on where the specified speed falls in the range from slowest to fastest.
Replace your polyLine() implementation with the following:
private func polyLine() -> [MulticolorPolyline] {
// 1
let locations = run.locations?.array as! [Location]
var coordinates: [(CLLocation, CLLocation)] = []
var speeds: [Double] = []
var minSpeed = Double.greatestFiniteMagnitude
var maxSpeed = 0.0
// 2
for (first, second) in zip(locations, locations.dropFirst()) {
let start = CLLocation(latitude: first.latitude, longitude: first.longitude)
let end = CLLocation(latitude: second.latitude, longitude: second.longitude)
coordinates.append((start, end))
//3
let distance = end.distance(from: start)
let time = second.timestamp!.timeIntervalSince(first.timestamp! as Date)
let speed = time > 0 ? distance / time : 0
speeds.append(speed)
minSpeed = min(minSpeed, speed)
maxSpeed = max(maxSpeed, speed)
}
//4
let midSpeed = speeds.reduce(0, +) / Double(speeds.count)
//5
var segments: [MulticolorPolyline] = []
for ((start, end), speed) in zip(coordinates, speeds) {
let coords = [start.coordinate, end.coordinate]
let segment = MulticolorPolyline(coordinates: coords, count: 2)
segment.color = segmentColor(speed: speed,
midSpeed: midSpeed,
slowestSpeed: minSpeed,
fastestSpeed: maxSpeed)
segments.append(segment)
}
return segments
}
Here's what the new version does:
1. Set up containers for the coordinate pairs, the segment speeds, and the overall slowest and fastest speeds.
2. Convert each saved Location into a CLLocation object and save them in consecutive (start, end) pairs.
3. Compute each segment's speed from its distance and time, tracking the slowest and fastest speeds seen so far.
4. Compute the average speed across all segments.
5. Build a MulticolorPolyline for each segment. Set its color based on where its speed falls between the slowest and fastest.
You will now see an error on the line mapView.add(polyLine()) in loadMap(). Replace that line with:
mapView.addOverlays(polyLine())
Now replace mapView(_:rendererFor:) in the MKMapViewDelegate extension with:
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
guard let polyline = overlay as? MulticolorPolyline else {
return MKOverlayRenderer(overlay: overlay)
}
let renderer = MKPolylineRenderer(polyline: polyline)
renderer.strokeColor = polyline.color
renderer.lineWidth = 3
return renderer
}
This is very similar to the previous version. It now expects each overlay to be a MulticolorPolyline and uses the embedded color to render the segment.
Build and run! Let the simulator take a quick jog and then check out the fancy colored map at the end!
The post-run map is stunning, but how about having a map during the run?
The storyboard is set up using UIStackViews to make it easy to add one!
First, open NewRunViewController.swift and import MapKit:
import MapKit
Now, open Main.storyboard and find the New Run View Controller Scene. Be sure the Document Outline is visible. If not, press the button outlined in red below:
Drag a UIView into the Document Outline and place it between the Top Stack View and the Button Stack View. Make sure it appears between them and not inside one of them. Double-click it and rename it to Map Container View.
In the Attributes Inspector, check Hidden under Drawing.
In the Document Outline, Control-drag from the Map Container View to the Top Stack View and select Equal Widths from the pop-up.
Drag an MKMapView into the Map Container View. Press the Add New Constraints button (a.k.a. the "Tie Fighter" button) and set all four constraints to 0. Make sure Constrain to margins is not checked. Click Add 4 Constraints.
With Map View selected in the Document Outline, open the Size Inspector (View\Utilities\Show Size Inspector). Double-click on the constraint Bottom Space to: Superview.
Change the priority to High (750).
In the Document Outline, Control-drag from Map View to New Run View Controller and select delegate.
Open the Assistant Editor, ensure it is showing NewRunViewController.swift and Control-drag from the Map View to create an outlet named mapView. Control-drag from Map Container View and create an outlet called mapContainerView.
Close the Assistant Editor and open NewRunViewController.swift.
Add the following to the top of startRun():
mapContainerView.isHidden = false
mapView.removeOverlays(mapView.overlays)
To the top of stopRun(), add the following:
mapContainerView.isHidden = true
Now, you need an MKMapViewDelegate to provide a renderer for the line. Add the following implementation in an extension at the bottom of the file:
extension NewRunViewController: MKMapViewDelegate {
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
guard let polyline = overlay as? MKPolyline else {
return MKOverlayRenderer(overlay: overlay)
}
let renderer = MKPolylineRenderer(polyline: polyline)
renderer.strokeColor = .blue
renderer.lineWidth = 3
return renderer
}
}
This is just like the delegate you wrote in RunDetailsViewController.swift except that the line is blue.
Finally, you just need to add the line segment overlay and update the map region to keep it focused on the area of your run. Add the following to locationManager(_:didUpdateLocations:) after the line distance = distance + Measurement(value: delta, unit: UnitLength.meters):
let coordinates = [lastLocation.coordinate, newLocation.coordinate]
mapView.add(MKPolyline(coordinates: coordinates, count: 2))
let region = MKCoordinateRegionMakeWithDistance(newLocation.coordinate, 500, 500)
mapView.setRegion(region, animated: true)
Build and run and start a new run. You will see your new map updating in real time!
Click here to download the project up to this point.
You may have noticed that the user's pace always displays in "min/mi", even if your locale causes the distance to be displayed in meters (or km). Find a way to use the locale to choose between .minutesPerMile and .minutesPerKilometer in the places you call FormatDisplay.pace(distance:seconds:outputUnit:).
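If you want a hint, one possible approach (certainly not the only one) is to let the current locale pick the unit wherever you call the formatter; Locale.current.usesMetricSystem makes this a one-liner:
// Choose the pace unit from the user's locale before formatting.
let paceUnit: UnitSpeed = Locale.current.usesMetricSystem ? .minutesPerKilometer : .minutesPerMile
let formattedPace = FormatDisplay.pace(distance: distance,
                                       seconds: seconds,
                                       outputUnit: paceUnit)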
Continue to part two of the How to Make an App Like Runkeeper tutorial where you will introduce an achievement badge system.
As always, I look forward to your comments and questions! :]
The post How To Make an App Like Runkeeper: Part 1 appeared first on Ray Wenderlich.
This is the second and final part of a tutorial that teaches you how to create an app like Runkeeper, complete with color-coded maps and badges!
In part one of the tutorial, you created an app that:
The app, in its current state, is great for recording and displaying data, but it needs a bit more spark to give users that extra bit of motivation.
In this section, you’ll complete the demo MoonRunner app by implementing a badge system that embodies the concept that fitness is a fun and progress-based achievement. Here’s how it works:
If you completed part one of the tutorial, you can continue on with your completed project from that tutorial. If you’re starting here, download this starter project.
Regardless of which file you use, you'll notice your project contains a number of images in the asset catalog and a file named badges.txt. Open badges.txt now. You can see it contains a large JSON array of badge objects. Each object contains a name, an image name, a fun information string, and the distance, in meters, that must be run to earn the badge.
The badges go all the way from 0 meters — hey, you have to start somewhere — up to the length of a full marathon.
The first task is to parse the JSON text into an array of badges. Add a new Swift file to your project, name it Badge.swift, and add the following implementation to it:
struct Badge {
let name: String
let imageName: String
let information: String
let distance: Double
init?(from dictionary: [String: String]) {
guard
let name = dictionary["name"],
let imageName = dictionary["imageName"],
let information = dictionary["information"],
let distanceString = dictionary["distance"],
let distance = Double(distanceString)
else {
return nil
}
self.name = name
self.imageName = imageName
self.information = information
self.distance = distance
}
}
This defines the Badge structure and provides a failable initializer to extract the information from the JSON object.
Add the following property to the structure to read and parse the JSON:
static let allBadges: [Badge] = {
guard let fileURL = Bundle.main.url(forResource: "badges", withExtension: "txt") else {
fatalError("No badges.txt file found")
}
do {
let jsonData = try Data(contentsOf: fileURL, options: .mappedIfSafe)
let jsonResult = try JSONSerialization.jsonObject(with: jsonData) as! [[String: String]]
return jsonResult.flatMap(Badge.init)
} catch {
fatalError("Cannot decode badges.txt")
}
}()
You use basic JSON deserialization to extract the data from the file and flatMap to discard any structures that fail to initialize. allBadges is declared static so that the expensive parsing operation happens only once.
You will need to be able to match Badges later, so add the following extension to the end of the file:
extension Badge: Equatable {
static func ==(lhs: Badge, rhs: Badge) -> Bool {
return lhs.name == rhs.name
}
}
Now that you have created the Badge structure, you'll need a structure to record when a badge was earned. This structure will associate a Badge with the various Run objects, if any, in which the user achieved versions of this badge.
Add a new Swift file to your project, name it BadgeStatus.swift, and add the following implementation to it:
struct BadgeStatus {
let badge: Badge
let earned: Run?
let silver: Run?
let gold: Run?
let best: Run?
static let silverMultiplier = 1.05
static let goldMultiplier = 1.1
}
This defines the BadgeStatus structure and the multipliers that determine how much a user's time must improve to earn a silver or gold badge. Now add the following method to the structure:
static func badgesEarned(runs: [Run]) -> [BadgeStatus] {
return Badge.allBadges.map { badge in
var earned: Run?
var silver: Run?
var gold: Run?
var best: Run?
for run in runs where run.distance > badge.distance {
if earned == nil {
earned = run
}
let earnedSpeed = earned!.distance / Double(earned!.duration)
let runSpeed = run.distance / Double(run.duration)
if silver == nil && runSpeed > earnedSpeed * silverMultiplier {
silver = run
}
if gold == nil && runSpeed > earnedSpeed * goldMultiplier {
gold = run
}
if let existingBest = best {
let bestSpeed = existingBest.distance / Double(existingBest.duration)
if runSpeed > bestSpeed {
best = run
}
} else {
best = run
}
}
return BadgeStatus(badge: badge, earned: earned, silver: silver, gold: gold, best: best)
}
}
This method compares each of the user's runs to the distance requirements for each badge, making the associations and returning an array of BadgeStatus values for each badge earned.
The first time a user earns a badge, that run’s speed becomes the reference used to determine if subsequent runs have improved enough to qualify for the silver or gold versions.
Lastly, the method keeps track of the user’s fastest run to each badge’s distance.
Now that you have all of the logic written to award badges, it's time to show them to the user. The starter project already has the necessary UI defined. You will display the list of badges in a UITableViewController. To do this, you first need to define the custom table view cell that displays a badge.
Add a new Swift file to your project and name it BadgeCell.swift. Replace the contents of the file with:
import UIKit
class BadgeCell: UITableViewCell {
@IBOutlet weak var badgeImageView: UIImageView!
@IBOutlet weak var silverImageView: UIImageView!
@IBOutlet weak var goldImageView: UIImageView!
@IBOutlet weak var nameLabel: UILabel!
@IBOutlet weak var earnedLabel: UILabel!
var status: BadgeStatus! {
didSet {
configure()
}
}
}
These are the outlets you will need to display information about a badge. You also declare a status variable, which is the model for the cell.
Next, add a configure() method to the cell, right under the status variable:
private let redLabel = #colorLiteral(red: 1, green: 0.07843137255, blue: 0.1725490196, alpha: 1)
private let greenLabel = #colorLiteral(red: 0, green: 0.5725490196, blue: 0.3058823529, alpha: 1)
private let badgeRotation = CGAffineTransform(rotationAngle: .pi / 8)
private func configure() {
silverImageView.isHidden = status.silver == nil
goldImageView.isHidden = status.gold == nil
if let earned = status.earned {
nameLabel.text = status.badge.name
nameLabel.textColor = greenLabel
let dateEarned = FormatDisplay.date(earned.timestamp)
earnedLabel.text = "Earned: \(dateEarned)"
earnedLabel.textColor = greenLabel
badgeImageView.image = UIImage(named: status.badge.imageName)
silverImageView.transform = badgeRotation
goldImageView.transform = badgeRotation
isUserInteractionEnabled = true
accessoryType = .disclosureIndicator
} else {
nameLabel.text = "?????"
nameLabel.textColor = redLabel
let formattedDistance = FormatDisplay.distance(status.badge.distance)
earnedLabel.text = "Run \(formattedDistance) to earn"
earnedLabel.textColor = redLabel
badgeImageView.image = nil
isUserInteractionEnabled = false
accessoryType = .none
selectionStyle = .none
}
}
This straightforward method configures the table view cell based on the BadgeStatus set into it.
If you copy and paste the code, you will notice that Xcode changes the #colorLiterals to swatches. If you're typing by hand, start typing the words Color literal, select the Xcode completion and double-click on the resulting swatch.
This will display a simple color picker. Click the Other… button.
This will bring up the system color picker. To match the colors used in the sample project, use the Hex Color # field and enter FF142C for red and 00924E for green.
Open Main.storyboard and connect your outlets to the BadgeCell in the Badges Table View Controller Scene:
Now that your table cell is defined, it is time to create the table view controller. Add a new Swift file to your project and name it BadgesTableViewController.swift. Replace the import section so that it imports UIKit and CoreData:
import UIKit
import CoreData
Now, add the class definition:
class BadgesTableViewController: UITableViewController {
var statusList: [BadgeStatus]!
override func viewDidLoad() {
super.viewDidLoad()
statusList = BadgeStatus.badgesEarned(runs: getRuns())
}
private func getRuns() -> [Run] {
let fetchRequest: NSFetchRequest<Run> = Run.fetchRequest()
let sortDescriptor = NSSortDescriptor(key: #keyPath(Run.timestamp), ascending: true)
fetchRequest.sortDescriptors = [sortDescriptor]
do {
return try CoreDataStack.context.fetch(fetchRequest)
} catch {
return []
}
}
}
When the view loads, you ask Core Data for a list of all completed runs, sorted by date, and then use this to build the list of badges earned.
Next, add the UITableViewDataSource methods in an extension:
extension BadgesTableViewController {
override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
return statusList.count
}
override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
let cell: BadgeCell = tableView.dequeueReusableCell(for: indexPath)
cell.status = statusList[indexPath.row]
return cell
}
}
These are the standard UITableViewDataSource methods required by all UITableViewControllers, returning the number of rows and the configured cells to the table. Just as in part 1, you are reducing "stringly typed" code by dequeuing the cell via a generic method defined in StoryboardSupport.swift.
Build and run to check out your new badges! You should see something like this:
The last view controller for MoonRunner is the one that shows the details of a badge. Add a new Swift file to your project and name it BadgeDetailsViewController.swift. Replace the contents of the file with the following:
import UIKit
class BadgeDetailsViewController: UIViewController {
@IBOutlet weak var badgeImageView: UIImageView!
@IBOutlet weak var nameLabel: UILabel!
@IBOutlet weak var distanceLabel: UILabel!
@IBOutlet weak var earnedLabel: UILabel!
@IBOutlet weak var bestLabel: UILabel!
@IBOutlet weak var silverLabel: UILabel!
@IBOutlet weak var goldLabel: UILabel!
@IBOutlet weak var silverImageView: UIImageView!
@IBOutlet weak var goldImageView: UIImageView!
var status: BadgeStatus!
}
This declares all of the outlets you will need to control the UI and the BadgeStatus that is the model for this view.
Next, add your viewDidLoad():
override func viewDidLoad() {
super.viewDidLoad()
let badgeRotation = CGAffineTransform(rotationAngle: .pi / 8)
badgeImageView.image = UIImage(named: status.badge.imageName)
nameLabel.text = status.badge.name
distanceLabel.text = FormatDisplay.distance(status.badge.distance)
let earnedDate = FormatDisplay.date(status.earned?.timestamp)
earnedLabel.text = "Reached on \(earnedDate)"
let bestDistance = Measurement(value: status.best!.distance, unit: UnitLength.meters)
let bestPace = FormatDisplay.pace(distance: bestDistance,
seconds: Int(status.best!.duration),
outputUnit: UnitSpeed.minutesPerMile)
let bestDate = FormatDisplay.date(status.earned?.timestamp)
bestLabel.text = "Best: \(bestPace), \(bestDate)"
let earnedDistance = Measurement(value: status.earned!.distance, unit: UnitLength.meters)
let earnedDuration = Int(status.earned!.duration)
}
This sets up the labels in the detail view from the BadgeStatus information. Now, you need to set up the gold and silver badges.
Add the following code to the end of viewDidLoad():
if let silver = status.silver {
silverImageView.transform = badgeRotation
silverImageView.alpha = 1
let silverDate = FormatDisplay.date(silver.timestamp)
silverLabel.text = "Earned on \(silverDate)"
} else {
silverImageView.alpha = 0
let silverDistance = earnedDistance * BadgeStatus.silverMultiplier
let pace = FormatDisplay.pace(distance: silverDistance,
seconds: earnedDuration,
outputUnit: UnitSpeed.minutesPerMile)
silverLabel.text = "Pace < \(pace) for silver!"
}
if let gold = status.gold {
goldImageView.transform = badgeRotation
goldImageView.alpha = 1
let goldDate = FormatDisplay.date(gold.timestamp)
goldLabel.text = "Earned on \(goldDate)"
} else {
goldImageView.alpha = 0
let goldDistance = earnedDistance * BadgeStatus.goldMultiplier
let pace = FormatDisplay.pace(distance: goldDistance,
seconds: earnedDuration,
outputUnit: UnitSpeed.minutesPerMile)
goldLabel.text = "Pace < \(pace) for gold!"
}
The gold and silver image views are hidden when necessary by setting their alphas to 0. This works around an interaction between nested UIStackViews and Auto Layout.
Finally, add the following method:
@IBAction func infoButtonTapped() {
let alert = UIAlertController(title: status.badge.name,
message: status.badge.information,
preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .cancel))
present(alert, animated: true)
}
This will be invoked when the info button is pressed and will show a pop-up with the badge's information.
Open Main.storyboard. Connect the outlets of BadgeDetailsViewController:
Connect the action infoButtonTapped() to the info button. Finally, select the Table View in the Badges Table View Controller Scene.
Check the User Interaction Enabled checkbox in the Attributes Inspector:
Open BadgesTableViewController.swift and add the following extension:
extension BadgesTableViewController: SegueHandlerType {
enum SegueIdentifier: String {
case details = "BadgeDetailsViewController"
}
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
switch segueIdentifier(for: segue) {
case .details:
let destination = segue.destination as! BadgeDetailsViewController
let indexPath = tableView.indexPathForSelectedRow!
destination.status = statusList[indexPath.row]
}
}
override func shouldPerformSegue(withIdentifier identifier: String, sender: Any?) -> Bool {
guard let segue = SegueIdentifier(rawValue: identifier) else { return false }
switch segue {
case .details:
guard let cell = sender as? UITableViewCell else { return false }
return cell.accessoryType == .disclosureIndicator
}
}
}
This takes care of passing a BadgeStatus to BadgeDetailsViewController when the user taps a badge in the table.
Note: at the time of writing, iOS 11 resets a cell's isUserInteractionEnabled to true after the cell is configured and before it is displayed. As a result, you must implement shouldPerformSegue(withIdentifier:sender:) to prevent accessing badge details for unearned badges. If later versions of iOS 11 correct this error, this method can be dropped.
Build and run. Check out your new badges' details!
Now that you have a cool new badge system, you need to update the UI of the existing app to incorporate it. Before you can do that, you need a couple of utility methods to determine the most recently earned badge and the next badge to earn for a given distance.
Open Badge.swift and add these methods:
static func best(for distance: Double) -> Badge {
return allBadges.filter { $0.distance < distance }.last ?? allBadges.first!
}
static func next(for distance: Double) -> Badge {
return allBadges.filter { distance < $0.distance }.first ?? allBadges.last!
}
Each of these methods filters the list of badges: best(for:) returns the last badge already earned for the given distance, while next(for:) returns the next badge yet to be earned.
Now, open Main.storyboard. Find the Button Stack View in the New Run View Controller Scene. Drag a UIImageView and a UILabel into the Document Outline. Make sure they are at the top of the Button Stack View:
Select both of these new views and select Editor\Embed In\Stack View. Change the resulting Stack View's properties as follows:
Set the Image View's Content Mode to Aspect Fit.
Change the Label's properties as follows:
Use your favorite Assistant Editor technique to connect outlets from the new Stack View, Image View and Label, named as follows:
@IBOutlet weak var badgeStackView: UIStackView!
@IBOutlet weak var badgeImageView: UIImageView!
@IBOutlet weak var badgeInfoLabel: UILabel!
Note: a behavior change in Xcode 9 beta 1 keeps you from simply hiding the new Badge Stack View in Main.storyboard. Then add the following line to viewDidLoad() in NewRunViewController.swift:
badgeStackView.isHidden = true // required to work around behavior change in Xcode 9 beta 1
With luck, this problem will be resolved in a future release of Xcode 9.
Open NewRunViewController.swift and import AVFoundation:
import AVFoundation
Now, add the following properties:
private var upcomingBadge: Badge!
private let successSound: AVAudioPlayer = {
guard let successSound = NSDataAsset(name: "success") else {
return AVAudioPlayer()
}
return try! AVAudioPlayer(data: successSound.data)
}()
successSound is created as an audio player for the "success sound" that will be played each time a new badge is earned.
Next, find updateDisplay() and add:
let distanceRemaining = upcomingBadge.distance - distance.value
let formattedDistanceRemaining = FormatDisplay.distance(distanceRemaining)
badgeInfoLabel.text = "\(formattedDistanceRemaining) until \(upcomingBadge.name)"
This will keep the user up-to-date about the next badge to be earned.
In startRun(), before the call to updateDisplay(), add:
badgeStackView.isHidden = false
upcomingBadge = Badge.next(for: 0)
badgeImageView.image = UIImage(named: upcomingBadge.imageName)
This shows the initial badge to earn.
In stopRun(), add:
badgeStackView.isHidden = true
Just like the other views, all of the badge info needs to be hidden between runs.
Add the following new method:
private func checkNextBadge() {
let nextBadge = Badge.next(for: distance.value)
if upcomingBadge != nextBadge {
badgeImageView.image = UIImage(named: nextBadge.imageName)
upcomingBadge = nextBadge
successSound.play()
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
}
}
This detects when a badge has been achieved, updates the UI to show the next badge, and plays a success sound to celebrate completing a badge.
In eachSecond(), add a call to checkNextBadge() before the call to updateDisplay():
checkNextBadge()
Build and run to watch the label update as the simulator goes for a run. Listen for the sound when it passes a new badge!
You may see a console message like the following:
[aqme] 254: AQDefaultDevice (188): skipping input stream 0 0 0x0
On the simulator, this is normal. The messages come from AVFoundation and do not indicate an error on your part.
Also, if you don't want to wait around to test out the badges, you can always switch to a different location mode in the Simulator's Debug\Location menu. Don't worry, we won't tell anyone. :]
After a run has finished, it would be nice to provide your users with the ability to see the last badge that they earned.
Open Main.storyboard and find the Run Details View Controller Scene. Drag a UIImageView on top of the Map View. Control-drag from the Image View to the Map View. On the resulting pop-up, hold down Shift and select Top, Bottom, Leading and Trailing. Click Add Constraints to pin the edges of the Image View to those of the Map View.
Xcode will add the constraints, each with a value of 0, which is exactly what you want. Currently, however, the Image View doesn't completely cover the Map View so you see the orange warning lines. Click the Update Frames button (outlined in red below) to resize the Image View.
Drag a UIButton on top of the Image View. Delete the Button's Title and set its Image value to info.
Control-drag from the button to the Image View. On the resulting pop-up, hold down Shift and select Bottom and Trailing. Click Add Constraints to pin the button to the bottom right corner of the image view.
In the Size Inspector, Edit each constraint and set its value to -8.
Click the Update Frames button again to fix the Button's size and position.
Select the Image View and set its Content Mode to Aspect Fit and its Alpha to 0.
Select the Button and set its Alpha to 0.
Drag a UISwitch and a UILabel into the bottom right corner of the view.
Select the Switch and press the Add New Constraints button (the "Tie Fighter" button). Add constraints for Right, Bottom and Left with a value of 8. Make sure the Left constraint is relative to the Label. Select Add 3 Constraints.
Set the Switch Value to Off.
Control-drag from the Switch to the Label. On the resulting pop-up, select Center Vertically.
Select the Label, set its Title to SPACE MODE and its Color to White Color.
In the Document Outline, Control-drag from the Switch to the Stack View. Select Vertical Spacing from the resulting pop-up.
In the Size Inspector for the Switch, Edit the constraint for Top Space to: Stack View. Set its relation to ≥ and its value to 8.
Whew! You deserve a badge after all of that layout work! :]
Open RunDetailsViewController.swift in the Assistant Editor and connect outlets for the Image View and Info Button as follows:
@IBOutlet weak var badgeImageView: UIImageView!
@IBOutlet weak var badgeInfoButton: UIButton!
Add the following action routine for the Switch and connect it:
@IBAction func displayModeToggled(_ sender: UISwitch) {
UIView.animate(withDuration: 0.2) {
self.badgeImageView.alpha = sender.isOn ? 1 : 0
self.badgeInfoButton.alpha = sender.isOn ? 1 : 0
self.mapView.alpha = sender.isOn ? 0 : 1
}
}
When the switch value changes, you animate the visibility of the Image View, the Info Button and the Map View by changing their alpha values.
Now add the action routine for the Info Button and connect it:
@IBAction func infoButtonTapped() {
let badge = Badge.best(for: run.distance)
let alert = UIAlertController(title: badge.name,
message: badge.information,
preferredStyle: .alert)
alert.addAction(UIAlertAction(title: "OK", style: .cancel))
present(alert, animated: true)
}
This is exactly the same as the button handler you implemented in BadgeDetailsViewController.swift.
The final step is to add the following to the end of configureView():
let badge = Badge.best(for: run.distance)
badgeImageView.image = UIImage(named: badge.imageName)
You find the last badge the user earned on the run and set it to display.
Build and run. Send the simulator on a run, save the details and try out your new "Space Mode"!
The post-run map already helps you remember your route and even identify specific areas where your speed was lower. Now you'll add a feature that shows exactly where each badge was earned.
MapKit uses annotations to display point data such as this. To create annotations, you need:
An object conforming to MKAnnotation that provides a coordinate describing the annotation's location.
An MKAnnotationView that displays the information associated with the annotation.
To implement this, you will:
Create a BadgeAnnotation class that conforms to MKAnnotation.
Build the BadgeAnnotation objects and add them to the map.
Implement mapView(_:viewFor:) to create the MKAnnotationViews.
Add a new Swift file to your project and name it BadgeAnnotation.swift. Replace its contents with:
import MapKit
class BadgeAnnotation: MKPointAnnotation {
let imageName: String
init(imageName: String) {
self.imageName = imageName
super.init()
}
}
MKPointAnnotation conforms to MKAnnotation, so all you need is a way to pass the image name to the rendering system.
Open RunDetailsViewController.swift and add this new method:
private func annotations() -> [BadgeAnnotation] {
var annotations: [BadgeAnnotation] = []
let badgesEarned = Badge.allBadges.filter { $0.distance < run.distance }
var badgeIterator = badgesEarned.makeIterator()
var nextBadge = badgeIterator.next()
let locations = run.locations?.array as! [Location]
var distance = 0.0
for (first, second) in zip(locations, locations.dropFirst()) {
guard let badge = nextBadge else { break }
let start = CLLocation(latitude: first.latitude, longitude: first.longitude)
let end = CLLocation(latitude: second.latitude, longitude: second.longitude)
distance += end.distance(from: start)
if distance >= badge.distance {
let badgeAnnotation = BadgeAnnotation(imageName: badge.imageName)
badgeAnnotation.coordinate = end.coordinate
badgeAnnotation.title = badge.name
badgeAnnotation.subtitle = FormatDisplay.distance(badge.distance)
annotations.append(badgeAnnotation)
nextBadge = badgeIterator.next()
}
}
return annotations
}
This creates an array of BadgeAnnotation objects, one for each badge earned on the run.
Add the following at the end of loadMap():
mapView.addAnnotations(annotations())
This puts the annotations on the map.
Finally, add this method to the MKMapViewDelegate extension:
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
  guard let annotation = annotation as? BadgeAnnotation else { return nil }
  let reuseID = "checkpoint"
  var annotationView = mapView.dequeueReusableAnnotationView(withIdentifier: reuseID)
  if annotationView == nil {
    annotationView = MKAnnotationView(annotation: annotation, reuseIdentifier: reuseID)
    annotationView?.image = #imageLiteral(resourceName: "mapPin")
    annotationView?.canShowCallout = true
  }
  annotationView?.annotation = annotation

  let badgeImageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 50, height: 50))
  badgeImageView.image = UIImage(named: annotation.imageName)
  badgeImageView.contentMode = .scaleAspectFit
  annotationView?.leftCalloutAccessoryView = badgeImageView

  return annotationView
}
Here, you create an MKAnnotationView for each annotation and configure it to display the badge's image.
Build and run. Send the simulator on a run and save the run at the end. The map will now have annotations for each badge earned. Click on one and you can see its name, picture and distance.
You can find the completed sample project for this tutorial here.
Over the course of this two-part tutorial, you built an app that tracks the user's runs, maps each route, and rewards the user with badges along the way.
There are more things for you to implement on your own:
- A list of the user's past runs: NSFetchedResultsController and the existing RunDetailsViewController make this a snap!
- Showing each badge's details from its MKAnnotationView callout.

Thanks for reading. As always, I look forward to your comments and questions! :]
The post How To Make an App Like Runkeeper: Part 2 appeared first on Ray Wenderlich.
Update: We have plenty of applicants at this point, so are no longer accepting applications. Thanks to everyone who applied!
Have you ever listened to a podcast, and thought “I wish I had a podcast?”
Well, now you have a chance – we are still looking for 1 more co-host for our official podcast!
Being a co-host of our podcast brings a lot of unique benefits:
As I mentioned, we have already selected one co-host, who is looking forward to talking with people interested in the second and final spot.
If you’re interested in applying, please email me the answers to these questions:
Thanks, and I am looking forward to hearing from some of you! :]
The post Looking For 1 More Podcast Co-Host appeared first on Ray Wenderlich.
The post Screencast: Scripting in Swift: Writing a Script appeared first on Ray Wenderlich.
You'll use the multicast closure delegate pattern from the previous video to create an auto re-login authentication client in this video.
The post Video Tutorial: iOS Design Patterns Part 6: Auto Re-Login Authentication appeared first on Ray Wenderlich.
MapKit is a really useful API available on iOS devices that makes it easy to display maps, plot locations, and even draw routes and other shapes on top.
This update uses public artworks data from Honolulu, where I was born and raised. It’s no longer my hometown, but the names and places bring back memories. If you’re not lucky enough to live there, I hope you’ll enjoy imagining yourself being there!
In this tutorial, you’ll make an app that zooms into a location in Honolulu, and plot one of the artworks on the map. You’ll implement the pin’s callout detail button to launch the Maps app, with driving/walking directions to the artwork. Your app will then parse a JSON file from a Honolulu data portal, to extract the public artwork objects, and plot them on the map.
In the process, you’ll learn how to add a MapKit map to your app, zoom to a particular location, parse government data that uses the Socrata Framework, create custom map annotations, and more!
This tutorial assumes some familiarity with Swift and iOS programming. If you are a complete beginner, check out some of the other tutorials on this site.
Now let’s get mapping!
Start by downloading the starter project, which contains the JSON file and some image assets, but no maps yet!
Open Main.storyboard. In the File Inspector, check the box for Use Safe Area Layout Guides. This stops you from setting constraints relative to the deprecated top and bottom layout guides, which prevents “deprecated” warnings.
In the Document Outline, select Safe Area, to see its top edge is slightly lower than the view’s top edge. From the Object library, drag a MapKit View into the upper corner of the scene, aligning its top edge with the dashed blue line below the view’s top edge, then drag its lower right corner to meet the view’s lower right corner. Use the Add New Constraints auto layout menu (the TIE fighter icon) to pin the map view: uncheck Constrain to margins, then set all the neighbor values to 0, and click Add 4 constraints:
Note: Normally, you don’t have to manually stretch the map view onto the scene — simply use the Add New Constraints menu to pin its edges — but this isn’t yet working in Xcode 9 beta.
Next, add this line to ViewController.swift, just below the import UIKit statement:
import MapKit
Build and run your project, and you’ll have a fully zoomable and pannable map showing the continent of your current location, using Apple Maps!
So far so good, eh? But you don’t want to start the map looking at the entire world – you want to zoom into a particular area!
To control the map view, you must create an outlet for it in ViewController.swift.
In the storyboard, open the assistant editor: it should display ViewController.swift.
To create the outlet, click the Map View in Main.storyboard, and control-drag from it into the space just inside the ViewController class definition: Xcode should prompt you to Insert Outlet or Outlet Collection. Release the drag and, in the pop-up window, name the outlet mapView:
Xcode adds a mapView property to the ViewController class: you’ll use this to control what the map view displays.
Switch back to the standard editor and, in ViewController.swift, find viewDidLoad(), and add the following to the end of the method:
// set initial location in Honolulu
let initialLocation = CLLocation(latitude: 21.282778, longitude: -157.829444)
You’ll use this to set the starting coordinates of the map view to a point in Honolulu.
When telling the map what to display, giving a latitude and longitude is enough to center the map, but you must also specify the rectangular region to display, to get a correct zoom level.
Add the following constant and helper method to the class:
let regionRadius: CLLocationDistance = 1000
func centerMapOnLocation(location: CLLocation) {
  let coordinateRegion = MKCoordinateRegionMakeWithDistance(location.coordinate,
                                                            regionRadius, regionRadius)
  mapView.setRegion(coordinateRegion, animated: true)
}
The location argument is the center point. The region will have north-south and east-west spans based on a distance of regionRadius. You set this to 1000 meters: a little more than half a mile, which works well for plotting the public artwork data in the JSON file.
setRegion(_:animated:) tells mapView to display the region. The map view automatically transitions the current view to the desired region with a neat zoom animation, with no extra code required!
Back in viewDidLoad(), add the following line to the end of the method:
centerMapOnLocation(location: initialLocation)
You’re calling the helper method to zoom into initialLocation on startup.
Build and run the app, and you’ll find yourself in the heart of Waikiki: aloha! :]
The next step is to plot interesting data around the current location. But where in the world can we get such stuff?
Well, it depends on your current location. Honolulu, like many cities, has an Open Data Portal to improve public access to government data. Like many cities, Honolulu’s data portal is “Powered by Socrata“, an open data framework that provides a rich set of developer tools for accessing Socrata-based data. After you finish this tutorial, maybe look around to see if a nearby city has an alternate dataset you can use?
For this tutorial, you’ll be using the Honolulu Public Art dataset. To keep things simple, I’ve already downloaded this data from the portal, and included it in the starter project.
To get a feeling for the items in this dataset, open PublicArt.json in the Xcode editor and scroll down to line 1180 (or use ⌘ + L for Jump to Line), which begins with "data" followed by an array of arrays – one array for each artwork. For this tutorial, you’ll use only a few properties from each array: the artwork’s location name, discipline, title, latitude and longitude. The first data item, for example, includes a value for each of these properties.
Later in this tutorial, you’ll parse this dataset to create an array of artworks but first, to jump straight into the MapKit fun, you’ll just plot one of the artworks on the map.
In PublicArt.json, jump or scroll to item 55 at line 1233: it’s a bronze statue of King David Kalakaua in Waikiki Gateway Park – ah, can you hear the waves breaking on the beach?
The properties for this item include the title “King David Kalakaua”, the location name “Waikiki Gateway Park”, the discipline “Sculpture”, and the coordinates 21.283921, -157.831661.
To show this on the map view, you must create a map annotation. Map annotations are small pieces of information tied to a particular location, and are often represented in Apple’s Maps app as little pins.
To create your own annotations, you create a class that conforms to the MKAnnotation protocol, add the annotation to the map, and inform the map how the annotation should be displayed.
First, create an Artwork class in a new Swift file: File\New\File, choose iOS\Source\Swift File, and click Next. Set the Save As field to Artwork.swift and click Create.
Open Artwork.swift in the editor and add the following, below import Foundation:
import MapKit
class Artwork: NSObject, MKAnnotation {
  let title: String?
  let locationName: String
  let discipline: String
  let coordinate: CLLocationCoordinate2D

  init(title: String, locationName: String, discipline: String, coordinate: CLLocationCoordinate2D) {
    self.title = title
    self.locationName = locationName
    self.discipline = discipline
    self.coordinate = coordinate

    super.init()
  }

  var subtitle: String? {
    return locationName
  }
}
To adopt the MKAnnotation protocol, Artwork must subclass NSObject, because MKAnnotation is an NSObjectProtocol.
The MKAnnotation protocol requires the coordinate property. If you want your annotation view to display a title and subtitle when the user taps a pin, your class also needs properties named title and subtitle.
It’s perfectly sensible for the Artwork class to have stored properties named title and coordinate, but none of the PublicArt.json properties maps naturally to the idea of “subtitle”. To conform to the MKAnnotation protocol, you make subtitle a computed property that returns locationName.
OK, so the title, locationName and coordinate properties will be used for the MKAnnotation object, but what’s the discipline property for? You’ll find out later in this tutorial! ;]
Next, you’ll add an Artwork object to the map view, for every artwork you want to plot. For now, you’re adding only one artwork, so switch to ViewController.swift and add the following lines to the end of viewDidLoad():
// show artwork on map
let artwork = Artwork(title: "King David Kalakaua",
                      locationName: "Waikiki Gateway Park",
                      discipline: "Sculpture",
                      coordinate: CLLocationCoordinate2D(latitude: 21.283921, longitude: -157.831661))
mapView.addAnnotation(artwork)
Here, you create a new Artwork object, and add it as an annotation to the map view. The MKMapView class also has an addAnnotations(_:) (plural) method, which you’ll use later in this tutorial, when you have an array of annotations to add to the map view.
Build and run your project, and now you should see where King David Kalakaua’s statue is, at the gateway to Waikiki!
The default annotation marker view shows the location, with the title below the marker. Select the marker: it grows, and now shows the subtitle, as well:
Well, that’s ok, but you’re used to pins that show a callout — a little bubble — when the user taps the marker. For that, you must configure the annotation view, and that’s the next step.
One way to configure the annotation view is to implement the map view’s mapView(_:viewFor:) delegate method. Your job in this delegate method is to return an instance of MKAnnotationView, to present as a visual indicator of the annotation.
In this case, ViewController will be the delegate for the map view. To avoid clutter and improve readability, you’ll create an extension of the ViewController class.
Add the following at the bottom of ViewController.swift:
extension ViewController: MKMapViewDelegate {
  // 1
  func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
    // 2
    guard let annotation = annotation as? Artwork else { return nil }
    // 3
    let identifier = "marker"
    var view: MKMarkerAnnotationView
    // 4
    if let dequeuedView = mapView.dequeueReusableAnnotationView(withIdentifier: identifier)
      as? MKMarkerAnnotationView {
      dequeuedView.annotation = annotation
      view = dequeuedView
    } else {
      // 5
      view = MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)
      view.canShowCallout = true
      view.calloutOffset = CGPoint(x: -5, y: 5)
      view.rightCalloutAccessoryView = UIButton(type: .detailDisclosure)
    }
    return view
  }
}
Here’s what you’re doing:
1. mapView(_:viewFor:) gets called for every annotation you add to the map (just like tableView(_:cellForRowAt:) when working with table views), to return the view for each annotation.
2. You check that the annotation is an Artwork object. If it isn’t, return nil to let the map view use its default annotation view.
3. You create each annotation view as an MKMarkerAnnotationView. Later in this tutorial, you’ll create MKAnnotationView objects, to display images instead of markers.
4. Also like tableView(_:cellForRowAt:), a map view reuses annotation views that are no longer visible. So you check to see if a reusable annotation view is available before creating a new one.
5. Here you create a new MKMarkerAnnotationView object, if an annotation view could not be dequeued. It uses the title and subtitle properties of your Artwork class to determine what to show in the callout.

Note: One extra thing to point out about this, suggested by Kalgar: when you dequeue a reusable annotation, you give it an identifier. If you have multiple styles of annotations, be sure to have a unique identifier for each one, otherwise you might mistakenly dequeue an identifier of a different type, and have unexpected behavior in your app. Again, it’s the same idea behind a cell identifier in tableView(_:cellForRowAt:).
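To make that note concrete, here is a minimal sketch, not part of this tutorial’s code, of how two annotation types might each keep their own reuse identifier. Fountain, the identifier strings and the marker view choice are all assumptions for illustration:
import MapKit

// Hypothetical second annotation type, only for illustration.
class Fountain: MKPointAnnotation { }

// Sketch of a delegate method that uses one reuse identifier per annotation type,
// so an Artwork view is never recycled as a Fountain view (and vice versa).
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
  let identifier: String
  switch annotation {
  case is Artwork:
    identifier = "artworkMarker"
  case is Fountain:
    identifier = "fountainPin"
  default:
    return nil
  }
  let view = mapView.dequeueReusableAnnotationView(withIdentifier: identifier)
    ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)
  view.annotation = annotation
  return view
}
Either way, the idea is the same as with table view cells: the identifier you dequeue with must match the identifier the view was created with.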
All that’s left is setting ViewController as the delegate of the map view. You can do this in Main.storyboard, but I prefer to do it in code, where it’s more visible. In ViewController.swift, add this line to viewDidLoad(), before the statement that creates artwork:
mapView.delegate = self
And that’s it! Build and run your project, and tap the marker to pop up the callout bubble:
mapView(_:viewFor:) configures the callout to include a detail disclosure info button on the right side, but tapping that button doesn’t do anything yet. You could implement it to show an alert with more info, or to open a detail view controller.
Here’s a neat third option: when the user taps the info button, your app will launch the Maps app, complete with driving/walking/transit directions to get from the simulated user location to the artwork!
To provide this great user experience, open Artwork.swift and add this import statement, below the other two:
import Contacts
This adds the Contacts framework, which contains dictionary key constants such as CNPostalAddressStreetKey, for when you need to set the address, city or state fields of a location.
Next, add the following helper method to the class:
// Annotation right callout accessory opens this mapItem in Maps app
func mapItem() -> MKMapItem {
  let addressDict = [CNPostalAddressStreetKey: subtitle!]
  let placemark = MKPlacemark(coordinate: coordinate, addressDictionary: addressDict)
  let mapItem = MKMapItem(placemark: placemark)
  mapItem.name = title
  return mapItem
}
Here you create an MKMapItem from an MKPlacemark. The Maps app is able to read this MKMapItem, and display the right thing.
Next, you have to tell MapKit what to do when the user taps the callout button. Open ViewController.swift, and add this method to the MKMapViewDelegate extension:
func mapView(_ mapView: MKMapView, annotationView view: MKAnnotationView,
             calloutAccessoryControlTapped control: UIControl) {
  let location = view.annotation as! Artwork
  let launchOptions = [MKLaunchOptionsDirectionsModeKey: MKLaunchOptionsDirectionsModeDriving]
  location.mapItem().openInMaps(launchOptions: launchOptions)
}
When the user taps a map annotation marker, the callout shows an info button. If the user taps this info button, the mapView(_:annotationView:calloutAccessoryControlTapped:) method is called.
In this method, you grab the Artwork object that this tap refers to, and then launch the Maps app by creating an associated MKMapItem, and calling openInMaps(launchOptions:) on the map item.
Notice you’re passing a dictionary to this method. This allows you to specify a few different options; here, MKLaunchOptionsDirectionsModeKey is set to driving. This causes the Maps app to show driving directions from the user’s current location to this pin. Neat!
Note: Take a look at the MKMapItem documentation to see other launch option dictionary keys, and the openMaps(with:launchOptions:) method that lets you pass an array of MKMapItem objects.
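As a rough sketch of that plural variant, and assuming you already had an array of Artwork objects like the artworks array you’ll build later in this tutorial, you could hand several map items to the Maps app in one call:
// Hedged sketch, not a tutorial step: open several artworks in Maps at once.
let mapItems = artworks.map { $0.mapItem() }
let launchOptions = [MKLaunchOptionsDirectionsModeKey: MKLaunchOptionsDirectionsModeWalking]
MKMapItem.openMaps(with: mapItems, launchOptions: launchOptions)
Maps should then display all of the items; here the launch option asks for walking rather than driving directions.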
Before you build and run, you should move to Honolulu – well, actually, just set your simulated location to Honolulu. In Xcode, go to Product\Scheme\Edit Scheme…, select Run from the left menu, then select the Options tab. Check Core Location: Allow Location Simulation, and select Honolulu, HI, USA as the Default Location. Then click the Close button:
Build and run the app, and you’ll see the map zoom in on Waikiki, as before. Tap on the marker, then tap the info button in the callout, and watch it launch the Maps app to show the statue’s location, with driving directions to it:
This calls for a celebration with your favorite tropical drink!
Now that you know how to show one artwork on the map, and how to launch the Maps app from the pin’s callout info button, it’s time to parse the dataset into an array of Artwork objects. Then you’ll add them as annotations to the map view, to display all artworks located in the current map region.
Add this failable initializer to Artwork.swift, below the initializer:
init?(json: [Any]) {
  // 1
  self.title = json[16] as? String ?? "No Title"
  self.locationName = json[12] as! String
  self.discipline = json[15] as! String

  // 2
  if let latitude = Double(json[18] as! String),
    let longitude = Double(json[19] as! String) {
    self.coordinate = CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
  } else {
    self.coordinate = CLLocationCoordinate2D()
  }
}
Here’s what you’re doing:
1. The json argument is one of the arrays that represent an artwork – an array of Any objects. If you count through an array’s elements, you’ll see that the title, locationName etc. are at the indexes specified in this method. The title field for some of the artworks is null, so you provide a default value for the title value.
2. The latitude and longitude values in the json array are strings: if you can create Double objects from them, you create a CLLocationCoordinate2D.

In other words, this initializer converts an array like this:
[ 55, "8492E480-43E9-4683-927F-0E82F3E1A024", 55, 1340413921, "436621", 1340413921, "436621", "{\n}", "Sean Browne", "Gift of the Oahu Kanyaku Imin Centennial Committee", "1989", "Large than life-size bronze figure of King David Kalakaua mounted on a granite pedestal. Located at Waikiki Gateway Park.", "Waikiki Gateway Park", "http://hiculturearts.pastperfect-online.com/34250images/002/199103-3.JPG", "1991.03", "Sculpture", "King David Kalakaua", "Full", "21.283921", "-157.831661", [ null, "21.283921", "-157.831661", null, false ], null ]
into an Artwork object like the one you created before:
- locationName: “Waikiki Gateway Park”
- discipline: “Sculpture”
- title: “King David Kalakaua”
- coordinate with latitude: 21.283921, longitude: -157.831661

To use this initializer, open ViewController.swift, and add the following property to the class – an array to hold the Artwork objects from the JSON file:
var artworks: [Artwork] = []
Next, add the following helper method to the class:
func loadInitialData() {
  // 1
  guard let fileName = Bundle.main.path(forResource: "PublicArt", ofType: "json")
    else { return }
  let optionalData = try? Data(contentsOf: URL(fileURLWithPath: fileName))

  guard
    let data = optionalData,
    // 2
    let json = try? JSONSerialization.jsonObject(with: data),
    // 3
    let dictionary = json as? [String: Any],
    // 4
    let works = dictionary["data"] as? [[Any]]
    else { return }
  // 5
  let validWorks = works.flatMap { Artwork(json: $0) }
  artworks.append(contentsOf: validWorks)
}
Here’s what you’re doing in this code:
1. You read the PublicArt.json file from the app bundle into a Data object.
2. You use JSONSerialization to obtain a JSON object.
3. You check that the JSON object is a dictionary with String keys and Any values.
4. You extract the array of arrays stored under the key "data".
5. You create an Artwork object for each array, using the failable initializer you added to the Artwork class, and append the resulting validWorks to the artworks array.

You now have an array of all the public artworks in the dataset, which you’ll add to the map.
Still in ViewController.swift, add the following code at the end of viewDidLoad():
loadInitialData()
mapView.addAnnotations(artworks)
Note: Be sure to use the plural addAnnotations, not the singular addAnnotation!
Comment out or delete the lines that create the single "King David Kalakaua" map annotation – you don’t need them, now that loadInitialData creates the artworks array:
// let artwork = Artwork(title: "King David Kalakaua",
// locationName: "Waikiki Gateway Park",
// discipline: "Sculpture",
// coordinate: CLLocationCoordinate2D(latitude: 21.283921, longitude: -157.831661))
// mapView.addAnnotation(artwork)
Build and run your app and check out all the markers!
Move the map around to see other markers appear. For example, north of your initial location, above Highway 1, is Honolulu’s Pioneer Artesian Well:
Note: Northwest of the marker is Punahou School, which claims a former US President as an alumnus! And West of the marker is the hospital where he was born. ;]
Tap a marker to open its callout bubble, then tap its info button to launch the Maps app – yes, everything you did with the King Kalakaua statue works with all these new artworks!
Note: Thanks to Dave Mark for pointing out that Apple recommends adding all the annotations right away, whether or not they’re visible in the map region – when you move the map, it automatically displays the visible annotations.
And that’s it! You’ve built an app that parses a JSON file into an array of artworks, then displays them as annotation markers, with a callout info button that launches the Maps app – celebrate with a hula dance around your desk! :]
But wait, there are a few bits of bling that I saved for last…
Remember the discipline property in the Artwork class? Its values are things like “Sculpture” and “Mural” – in fact, the most numerous disciplines are Sculpture, Plaque, Mural and Monument. It’s easy to color-code the markers so these disciplines have markers of different colors, with green markers for all the other disciplines.
In Artwork.swift, add this property:
// markerTintColor for disciplines: Sculpture, Plaque, Mural, Monument, other
var markerTintColor: UIColor {
  switch discipline {
  case "Monument":
    return .red
  case "Mural":
    return .cyan
  case "Plaque":
    return .blue
  case "Sculpture":
    return .purple
  default:
    return .green
  }
}
Now, you could keep adding code to mapView(_:viewFor:), but that would clutter up the view controller. There’s a more elegant way, similar to what you can do for table view cells. Create a new Swift file named ArtworkViews.swift, and add this code, below the import statement:
import MapKit
class ArtworkMarkerView: MKMarkerAnnotationView {
  override var annotation: MKAnnotation? {
    willSet {
      // 1
      guard let artwork = newValue as? Artwork else { return }
      canShowCallout = true
      calloutOffset = CGPoint(x: -5, y: 5)
      rightCalloutAccessoryView = UIButton(type: .detailDisclosure)
      // 2
      markerTintColor = artwork.markerTintColor
      glyphText = String(artwork.discipline.first!)
    }
  }
}
Soon, you’ll register this class as a reusable annotation view for Artwork annotations. The system will pass it an annotation as newValue, so here’s what you’re doing:
1. Much like mapView(_:viewFor:), you make sure the annotation is an Artwork, then configure the callout.
2. You set the marker’s tint color and glyph text, based on the artwork’s discipline.

Now switch to ViewController.swift, and add this line to viewDidLoad(), just before calling loadInitialData():
mapView.register(ArtworkMarkerView.self,
                 forAnnotationViewWithReuseIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier)
Here, you register your new class with the map view’s default reuse identifier. For an app with more annotation types, you would register classes with custom identifiers.
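As a hedged sketch of what that might look like, where FountainView and the identifier strings are assumptions rather than part of this tutorial:
// A second, hypothetical view class for a different annotation type.
class FountainView: MKMarkerAnnotationView { }

// Register each view class under its own identifier...
mapView.register(ArtworkMarkerView.self, forAnnotationViewWithReuseIdentifier: "artwork")
mapView.register(FountainView.self, forAnnotationViewWithReuseIdentifier: "fountain")

// ...then, in mapView(_:viewFor:), dequeue with the identifier that matches the
// annotation's type, for example:
// mapView.dequeueReusableAnnotationView(withIdentifier: "artwork", for: annotation)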
Scroll down to the extension, and comment out the mapView(_:viewFor:) method.
Build and run your app, then move the map around, to see the different colored and labeled markers:
In this section of the map, there are actually a lot more artworks than the map view shows: it reduces clutter by clustering markers that are too close together. In the next section, you’ll see all the annotations.
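If you’re curious how that grouping can be controlled explicitly, here is a hedged sketch, not a step of this tutorial, using iOS 11’s clusteringIdentifier; the identifier string and display priority are assumptions:
import MapKit

class ClusteredArtworkMarkerView: MKMarkerAnnotationView {
  override var annotation: MKAnnotation? {
    willSet {
      guard newValue is Artwork else { return }
      clusteringIdentifier = "artwork"   // views sharing this value may be grouped into one cluster
      displayPriority = .defaultLow      // lets the map hide or cluster markers when the region is crowded
    }
  }
}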
But first, set the glyph’s image instead of its text: in ArtworkMarkerView, comment out the glyphText line, and add these lines:
if let imageName = artwork.imageName {
  glyphImage = UIImage(named: imageName)
}
These images from icons8.com are already in Images.xcassets.
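Note that artwork.imageName isn’t a property you added to Artwork earlier in this write-up; the code assumes Artwork can map its discipline to one of those bundled assets. A minimal sketch of such a property might look like the following, where the asset names "Statue" and "Flag" are assumptions based on the sculptures and flags mentioned below:
extension Artwork {
  // Assumed helper: pick a bundled image name based on the discipline.
  var imageName: String? {
    if discipline == "Sculpture" { return "Statue" }
    return "Flag"
  }
}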
Build and run your app to see different colored markers with images:
And that’s a segue to another customization option, and your next task: replace the markers with images!
In ArtworkViews.swift, add the following class:
class ArtworkView: MKAnnotationView {
  override var annotation: MKAnnotation? {
    willSet {
      guard let artwork = newValue as? Artwork else { return }
      canShowCallout = true
      calloutOffset = CGPoint(x: -5, y: 5)
      rightCalloutAccessoryView = UIButton(type: .detailDisclosure)
      if let imageName = artwork.imageName {
        image = UIImage(named: imageName)
      }
    }
  }
}
Now, you’re using a plain old MKAnnotationView instead of an MKMarkerAnnotationView, and the view has an image property.
Back in ViewController.swift, in viewDidLoad(), register this new class, instead of ArtworkMarkerView:
mapView.register(ArtworkView.self,
                 forAnnotationViewWithReuseIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier)
Build and run your app to see the sculptures and flags:
Now, you don’t see the titles, but the map view shows all the annotations.
The right callout accessory is an info button, but tapping it opens the Maps app, so now you’ll change the button to show the Maps icon.
Find this line in ArtworkView:
rightCalloutAccessoryView = UIButton(type: .detailDisclosure)
Replace this line with the following code:
let mapsButton = UIButton(frame: CGRect(origin: CGPoint.zero,
                                        size: CGSize(width: 30, height: 30)))
mapsButton.setBackgroundImage(UIImage(named: "Maps-icon"), for: UIControlState())
rightCalloutAccessoryView = mapsButton
Here, you create a UIButton, set its background image to the Maps icon from iconarchive.com in Images.xcassets, then set the view’s right callout accessory to this button.
Build and run your app, then tap a view to see the new Maps button:
The final customization is the detail callout accessory: it’s a single line, which is enough for the short location text, but what if you want to show a lot of text?
In Artwork.swift, locate this line in init(json:):
self.locationName = json[12] as! String
Replace it with this line:
self.locationName = json[11] as! String
Here you’re opting for the long description of the artwork, which doesn’t fit in the default one-line detail callout accessory. Now you need a multi-line label: add the following code to ArtworkView:
let detailLabel = UILabel()
detailLabel.numberOfLines = 0
detailLabel.font = detailLabel.font.withSize(12)
detailLabel.text = artwork.subtitle
detailCalloutAccessoryView = detailLabel
Build and run your app, then tap a view to see the long description:
This app doesn’t need to ask the user for authorization to access their location, but it’s something you might want to include in your other MapKit-based apps.
In ViewController.swift, add the following lines:
let locationManager = CLLocationManager()

func checkLocationAuthorizationStatus() {
  if CLLocationManager.authorizationStatus() == .authorizedWhenInUse {
    mapView.showsUserLocation = true
  } else {
    locationManager.requestWhenInUseAuthorization()
  }
}

override func viewDidAppear(_ animated: Bool) {
  super.viewDidAppear(animated)
  checkLocationAuthorizationStatus()
}
Here, you create a CLLocationManager to keep track of your app’s authorization status for accessing the user’s location. In checkLocationAuthorizationStatus(), you “tick” the map view’s Shows-User-Location checkbox if your app is authorized; otherwise, you tell locationManager to request authorization from the user.
locationManager can make two kinds of authorization requests: requestWhenInUseAuthorization or requestAlwaysAuthorization. The first lets your app use location services while it is in the foreground; the second authorizes your app whenever it is running. Apple’s documentation discourages the use of “Always”:
Requesting “Always” authorization is discouraged because of the potential negative impacts to user privacy. You should request this level of authorization only when doing so offers a genuine benefit to the user.
There’s just one more authorization-related task you need to do – if you don’t, your app won’t crash but the locationManager’s request won’t appear. To get the request to work, you must provide a message explaining to the user why your app wants to access their location.
In Info.plist, open Information Property List. Hover your cursor over the up-down arrows, or click on any item in the list, to display the + and – symbols, then click the + symbol to create a new item. Scroll down to select Privacy – Location When In Use Usage Description, then set its Value to something like To show you cool things nearby:
Build and run. You’ll see the permission request appear on launch:
With a usage description like that, who wouldn’t allow access? ;]
Note: To request “Always” authorization, the documentation refers to both the “NSLocationAlwaysAndWhenInUseUsageDescription and NSLocationWhenInUseUsageDescription keys…”. In Xcode 9 beta, I had to use the NSLocationAlwaysAndWhenInUseUsageDescription key; Xcode wouldn’t select the matching Privacy key.
Below, the location manager requested “Always”:
Here is the final project with all of the code you’ve developed in this tutorial.
Now you know the basics of using MapKit, but there’s a lot more you can add: geocoding, geofencing, custom map overlays, and more. A great place to find additional information is Apple’s Location and Maps Programming Guide.
Also look at WWDC 2017 Session 237 What’s New in MapKit, to find more cool features they added in iOS 11.
We also have a terrific video course MapKit & Core Location that covers a lot of awesome topics.
Our 2-part How to Make an App Like RunKeeper tutorial shows you how to create your own run-tracker, using Core Location to map your run in real time.
If you want to decorate or customize the map provided by Apple with your own annotations and images, look at MapKit Tutorial: Overlay Views.
If you have any questions as you use MapKit in your apps, or tips for other MapKit users, please join in the forum discussion below!
The post MapKit Tutorial: Getting Started appeared first on Ray Wenderlich.
In this video you'll learn about the memento pattern which allows an object's state to be saved and restored later.
The post Video Tutorial: iOS Design Patterns Part 7: Memento appeared first on Ray Wenderlich.
Update Note: This Beginning Android Development tutorial is now up to date with the latest version of Android Studio. Update by Eunice Obugyei. Original tutorial by Matt Luedke. Previous updates by Darryl Bayliss and Megha Bambra.
Clearly there’s a demand for Android app development since there are over two billion monthly active users around the globe. To say that it’s an exciting platform and space to make apps for is an understatement.
There aren’t any prerequisites for this tutorial, other than a willing mind and a Mac — you can certainly develop for Android on PC, but these instructions are tooled for Mac-based developers.
You’ll learn how to set up all the tools needed to start you on your way to becoming an Android developer. Here’s what you’ll do in this beginning Android development tutorial:
One of the most important parts of getting started with any new platform is setting up your environment, and this is no different with Android.
It’s important to take your time and follow each step methodically. Even if you follow the steps perfectly, you may have to troubleshoot a small issue or few. Your system configuration or product versions can make for unexpected results.
With all of this in mind, let’s quickly check that you have the Java Development Kit (JDK) installed. To check, you’ll use trusty old Terminal.
Note: You’ll learn the essential steps for this tutorial in the next few paragraphs, but if you’d like to deepen your knowledge of the Terminal, you’ll find a good introductory tutorial about it in this blog from teamtreehouse.com.
In a nutshell, using Terminal is kind of like looking under your car’s hood. It’s how you really get to know the machine face-to-face, without any complex graphical interface to interfere.
You can find the Terminal app quite easily on a Mac: open Launchpad and type terminal into the search at the top of the screen and select Terminal when it shows up.
Once you have Terminal open, type in java -version. You should see some output that mentions a version number, like below.
If that’s not what you see, then you don’t have the JDK installed. Terminal might tell you -bash: java: command not found, or it could say “No Java runtime present, requesting install.” and trigger a pop up that will lead you down the yellow brick road… to Oracle’s website.
You can either click More Info… or head over to Oracle to download the JDK from Oracle.
Install the JDK if needed, and once you’re done, head over to the Android Studio page and click the Download Android Studio button.
Google constantly updates this page, so the version you see may very well be newer than the screenshot above. Once you click the Download Android Studio button, you’ll see a request to agree to the terms and conditions.
After reading these carefully (everybody takes the time to fully read these, right?) accept and click the blue button underneath titled Download Android Studio For Mac. Once the download is complete, you can install Android Studio just like how you install any other program.
The download link will redirect to a page that contains installation instructions for OS X, Windows and Linux Operating Systems. If the instructions don’t appear, then you can view them here.
Once installation wraps itself up, go ahead and launch Android Studio!
The setup wizard will greet you the first time it loads.
Click Next to move to the Install Type screen. This whole process will probably take several minutes.
Check the box for Standard and click Next.
On the Verify Settings window, you’ll have an opportunity to confirm your setup. Click Finish to start downloading the SDK components.
When the download is complete, click Finish.
After a few minutes, you’ll have the welcome screen, which serves as your gateway to building all things Android.
Even though you just downloaded Android Studio, it’s possible that it’s not the latest version. Select Configure/Check for Update at the bottom of the welcome screen to check whether any updates are available.
If an update is available, a window like the screenshot below will appear. Click the Update and Restart button and let it do its thing.
Each version of Android has its own SDK (Software Development Kit) that enables you to create applications for the Android platform. Since you just went through the setup wizard, you’ll already have the latest version of the SDK available to you.
It’s useful to know how to install additional versions of the SDK to help you develop for all supported versions of Android.
SDKs allow you to create AVDs (Android Virtual Devices) according to your personal configuration for the purpose of testing your app.
From the Android Studio welcome screen, select Configure/SDK Manager.
Once it launches, you’ll see a window like the one below:
The first tab of this window, SDK Platforms, lists the Android SDK platforms available for download.
Enabling the Show Package Details option displays the individual SDK components, such as the platform itself and the sources and system images pertaining to that API level.
Take note of the checkbox next to the SDK platform; it will be pre-selected if an update is available.
By default, the SDK Manager installs the latest packages and tools. Select the SDKs as shown in the screenshot above. If you wish to install other SDKs, just select them for installation.
The SDK Tools tab lists developer tools and documentation along with the latest versions. Similar to the first tab, checking Show Package Details will display the available versions of the SDK tools.
Three of the selected components in this list, for example, are Android SDK Build-Tools, Android SDK Tools and Android SDK Platform-Tools. Each contains components that are designed to assist in the development of Android and work across multiple SDKs. Go with the default selection on this tab.
The SDK Update Sites tab displays the update sites for Android SDK tools and add-ons. You’re not limited to what’s listed under that tab. You can add other sites that host their own Android SDK add-ons, and then download them from those sites.
For the purpose of setting up correctly, select the options that are checked in the screenshot above. Click Apply at the bottom if it’s active. You’ll see a confirmation dialog for the chosen packages; accept and continue. Click OK to close the window.
The confirmation dialog will disappear and a license agreement will pop up.
Read through it, select the Accept checkbox and click Next. The SDK Manager will download and install the selected items. Once it’s done, click Finish. You’ll be directed back to the SDK Manager window where clicking OK will take you back to the Welcome to Android Studio screen.
This is where the fun begins!
Android Studio has a nice little step-by-step tool to help you create your project. Click Start a new Android Studio Project from the Welcome to Android Studio screen:
Note: If you currently have an Android Studio project open and can’t see the welcome screen, select File\New Project from the menu to create a new project.
Android Studio will present you with a project creation screen:
Enter OMG Android in Application name as shown above. Feel free to put your own name in the Company domain text field. As you type, you’ll notice the Package name automatically changes to create a reverse domain style name per your entries.
The Package name is used to uniquely identify your app so that any work done on a device is always properly attributed to the source, thus preventing confusion between apps.
You can set the Project location to any location on your hard drive — keep the default if you don’t have a preference. Click Next at the bottom of the window.
The next screen is the Target Android Devices window. This is where you select device types and operating systems to target.
The Minimum SDK drop-down menu sets the minimum version of Android required to run your app. The newer the SDK, the more features you’ll have at your disposal; however, newer SDKs support fewer devices.
Selecting this value is simply a matter of balancing the capabilities you want and the devices you want to support. This is where developing for Android can get a little tricky.
If you really want to get into the details of what Minimum SDK version is best for your App, let Android Studio help you out.
As you change the Minimum SDK in the drop down menu, the percentage in the text underneath reflects what percentage of devices currently run that version of Android.
Click Help me choose underneath the drop down list to learn more about each SDK’s set of features.
For more on API versions statistics, check out the Android Dashboards, which are updated periodically.
For now, you just want an App that works on an Android Phone, and that is what you’ll see by default, alongside the default Minimum SDK. For this project, select SDK of API 16: Android 4.1 (Jelly Bean) and click Next.
The next step is to select your default Activity.
Think of an Activity as a window within your app that displays content with which the user can interact. An Activity can take up the entire screen or it could be a simple pop-up.
Your options on this particular template range from a Basic Activity, which is a blank screen with an Action Bar and a Floating Button right up to an Activity with an embedded MapView.
You’ll make a lot of activities as you develop more apps, so get to know them and know them well.
Select the Basic Activity and click Next.
To speed this part up a little bit, you’ll use the pre-populated default values, but what is actually done with these values?
- Activity Name: a .java class will be created and will use the contents of this text field to give the class a name, which will ultimately be the name you use to refer to this Activity in your code.

Click Finish.
Android Studio takes this as a cue to go do a bunch of behind-the-scenes operations and create your project. As it shoots out descriptions of what it’s doing, you may notice it says something like this:
Here, you see your project name, which is familiar. But then there is this Gradle word as well.
The benefit of having a modern IDE like Android Studio is that it handles a lot for you. But, as you’re learning how to use the software, it’s good to have a general sense of what it does for you.
Gradle is a build tool that’s easy to use, and if you investigate further, you’ll find it contains advanced options. It takes your Java code, XML layouts and the latest Android build tools to create the app package file, also known as an APK (Android Package Kit) file.
You can customize your configurations to have development or production versions of the app that behave differently. You can also add dependencies for third-party libraries.
After a brief moment, Android Studio will finish building your project. The project is pretty empty, of course, but it has everything it needs set up so that it can be launched on an Android device or emulator.
Let’s take a brief look at the different parts of the project.
Open res/layout/activity_main.xml and click Text at the bottom of the screen.
This shows you the XML code for your main layout. It also shows a preview of how it will look on a device on the right side of the screen.
Note: If the Preview screen is not shown by default for you, open it by selecting View\Tool Windows\Preview from the menu
You’ve got Android Studio and you’ve created an app. So how do you run it?
Android Studio comes with the ability to set up a software-based Android device on your computer and run apps on it, browse websites, debug and everything you would expect from a simulator. This capability is known as the Android Emulator.
You can set up multiple emulators and set the screen size and platform version for each to whatever you like. Good thing, too. You’d need a whole room dedicated to storing devices for testing because there are so many out there — okay, maybe that’s an exaggeration, but you get the idea. :]
If you ran through the setup wizard earlier using the standard installation, then you’ll already have an emulator set up and ready for you.
Up until recently, your computer would have to emulate everything an Android device would try to do, right down to its hardware, which runs an ARM-based processor. Most computers make use of x86-based processors, meaning your computer has to translate each instruction to one that an ARM-based processor would understand and this takes a significant amount of time. To reduce this overhead, Android Studio has recently adopted the HAXM driver which is able to speed things up a bit.
You still have the option to create an emulator that is as close to an actual device as you can, but be aware that the initial load times can drag a bit and have put off many an Android developer from using emulators at all.
With all that being said…let’s set up an emulator anyway, because you do need to know how!
Click the AVD Manager. It’s a button near the right side of the toolbar that shows an Android popping its head up next to a device with a purple display:
The AVD Manager will open to a screen with an option to create a new device:
Click the Create Virtual Device… button to start the process of creating a new virtual device.
The first step is to select the type of device. The Category section, on the left side of the screen, shows a list of the different device types you can emulate (TV, Wear, Phone, Tablet). Make sure the Phone option is selected. In the middle of the screen, there is a list of specific devices with their screen size, resolution and density. Take a moment to explore them.
What you need now is just to emulate a phone-sized device, but if you wanted to emulate an Android Wear watch or an Android TV then you have options to do so here.
Select Pixel in the list of devices available to you from the phone category and click Next.
On the next screen, you have to select the version of Android the virtual device will run.
For this tutorial, select Nougat and make sure the one selected has the value x86 in the ABI column so the emulator runs as fast as possible on your x86 computer.
Note: If that version is not already downloaded, click on the Download link beside the release name to download before you continue.
Click Next once you’re done to advance to the final screen.
The last screen lets you confirm your choices and gives options to configure some other properties such as device name and startup orientation. Clicking the Show Advanced Settings button, shows you extra configurations you can change such as Camera, Network and Memory settings.
Use the defaults and click Finish.
Close the AVD Manager to go back to Android Studio’s main view. Now that you’ve configured everything, click the Run button.
A new window will appear, asking you to choose the device you wish to test your App on. You currently have no devices running, so select the Pixel you just created and click OK.
Note: If you get an error that says This AVD’s configuration is missing a kernel file!!, check to make sure that you don’t have the ANDROID_SDK_ROOT environment variable set from a previous installation of the Android SDK. See this thread on Stack Overflow for more troubleshooting tips.
In the event that it doesn’t work the first time or takes several minutes for the emulator to fire up correctly, don’t worry, that’s not entirely unexpected. Stick with it. Once it’s ready, you should see something like this:
Whoa. You just made your first Android app.
As you may have noticed, there’s a panel on the right side of the emulator. That is the emulator toolbar. The toolbar lets you perform common tasks such as taking screenshots, rotating the screen and controlling the volume, and it offers extended functionality such as simulating device location, phone calls, messages and fingerprints.
To access the extended functionalities, click the More (…) icon at the bottom of the toolbar.
If you have an Android device you want to run your app on, you first need to enable Developer Mode on it. Here are the step-by-step instructions to enable Developer Mode on an Android device: open Settings, scroll down to About Phone, and tap Build number seven times until you see a message saying you are now a developer; then go back to Settings, open Developer options and turn on USB debugging.
Now that you’ve configured your device, click the Run button.
Just like before, you’ll get a prompt from the Select Deployment Target dialog. The device should now appear in this dialog. Select it and click OK.
Ahh…isn’t it rewarding to see the app on your device? Go ahead and show it off to your friends. :]
Note: If the app is already running, you might not get the prompt. This is because of a new functionality in Android Studio known as Instant Run. We’ll talk about it in the next section of this tutorial. Close the emulator, go back and click the Run button again.
Android Studio 2.0 introduced a feature called Instant Run. Instant Run allows you to push updates (code and resources) to a running app on a device or emulator without performing a full reinstall, so you can see your changes in a shorter time.
There are three kinds of changes you can make to your code: a hot swap, warm swap, or cold swap. Instant Run pushes updates by performing one of the following, depending on the kind of change you made:
- Hot swap: changes to the body of an existing method are applied without restarting the app, and take effect the next time the method runs.
- Warm swap: resource changes are pushed, but the current activity restarts so they become visible.
- Cold swap: structural code changes, such as adding or removing a method or changing a signature, require the whole app to restart (though not a reinstall).
Go ahead and try out Instant Run.
To enable Instant Run, select Android Studio \ Preferences \ Build, Execution, Deployment \ Instant Run. Ensure that Enable Instant Run to hot swap code/resource changes on deploy is checked and that Restart activity on code changes is unchecked.
If your app is not yet running, launch it by clicking the Run button, and wait for it to launch.
When the app is running, the Apply Changes button to the right of the Run button becomes enabled.
With your app running, clicking the floating button shows the message Replace with your own action at the bottom of the screen.
Change that message to test out Instant Run.
Open MainActivity and, in the onCreate() method, replace the text Replace with your own action with Hello Instant Run.
Then click Apply Changes. Now when you click the button, you will see the new message. This is an example of a hot swap.
Instant Run helps you code faster by significantly reducing the time it takes to update your app with code and resource changes.
During your Android app-making journey, you’ll find times when you need to import existing projects. The usual route is File\New\Import Project… in Android Studio: browse to the project’s root folder, select it and let Gradle sync the project for you.
It’s build and run time! Click the Run button in the toolbar and select either the emulator or device you’ve already set up.
You’ve covered a lot of ground in this beginning Android development tutorial: from downloading and installing Android Studio, through creating your first “Hello World!” app, to deploying it on a physical device!
Keep reading for the next part of the series, where you’ll take a tour of Android Studio.
In the meantime, keep following Android itself. Like any language or framework, Android’s development community is a strong asset and a supplier of endless reference material. It’s never too soon or too late to start checking out Google’s I/O conference, the Android Developers blog or the Android Developer videos.
The Android robot is reproduced or modified from work created and shared by Google and used according to terms described in the Creative Commons 3.0 Attribution License.
I hope you enjoyed this beginning Android development tutorial — you’ve successfully installed Android Studio and are now ready to take on the world of Android development. If you have any questions or comments, please join the discussion in the comments below.
The post Beginning Android Development Part One: Installing Android Studio appeared first on Ray Wenderlich.
In this video you'll learn about "composition over inheritance," a design principle used by most design patterns.
The post Video Tutorial: iOS Design Patterns Part 8: Composition Over Inheritance appeared first on Ray Wenderlich.
This screencast will first show you how you can manage your own code via building modules, before moving on to see one way you can use 3rd-party dependencies within your scripts.
The post Screencast: Scripting in Swift: Managing Dependencies appeared first on Ray Wenderlich.
Learn how to DRY out your storyboard UIs using container views. This is in preparation for the next video where you'll implement the visitor pattern.
The post Video Tutorial: iOS Design Patterns Part 9: Container Views appeared first on Ray Wenderlich.
Well WWDC has come and gone. I’m sure all of you are heads down watching videos, reading change logs, and implementing all the new APIs in your apps.
I’m honored you’re taking a quick break to see what the community has been building. Every month, readers like you release great apps built with a little help from our tutorials, books, and videos. I want to highlight a few today to give just a peek at what our fantastic readers are creating.
This month we have:
Keep reading to see the latest apps released by raywenderlich.com readers like you.
Xcode has gotten better over the years at managing multiple projects. But it can still leave some things to be desired and starts to fall apart after just 5 or 6 projects.
XcLauncher is a lightweight Mac app here to help. XcLauncher lives in your menu bar and offers you quick shortcuts for your Xcode projects. These shortcuts are customizable by name and order. You can also group projects into folders for quick organization and access. It works with projects, workspaces, and even playgrounds. You can also easily open up your recent files as well.
Another aspect of development for many of us is multiple versions of Xcode. Between betas and client projects stuck on older versions, many of us have 3 or more Xcode versions at a time. XcLauncher also makes this easier. It can quickly list all versions of Xcode installed and you can launch the right one right from the menu bar.
If you’re a developer, I suggest giving XcLauncher a try. I set it to launch on startup. ;]
Have you ever been flying and wondered what type of plane you’re actually on? Often the closest we get to knowing is when the safety presentation mentions the code number before suggesting we check out the pamphlet in the seat-back pocket.
What am I flying on? is here to make it easier for you to answer that very question. This app seems to have almost every plane ever made. You can easily look up any plane from any manufacturer. There is tons of info available. You can see general specifications for each plane like wing span, engines, and number of seats. You can also see basic flight information like the takeoff and landing runway lengths required or maximum cruising altitude.
This app is packed with more information and more planes than I knew to ask about. Definitely give it a download, and the next time you’re flying, learn a little more about the plane carrying you above the clouds.
Talene is a playground for children to try all sorts of things on the iPad while having fun and learning. It’s a safe space with all kinds of different activities and games.
Talene really packs it in. It has matching games, word search, and word hunt to help with word association. It has a full drawing suite that lets kids be creative and draw whatever they want while saving their favorites. It has activities to help with letter and number forms, allowing kids to draw with their finger to fill in stencils.
It even has an alphabet piano that is a fun way to learn new words and the alphabet. Tapping on a key will say the letter aloud and a random word that begins with that letter. It’s a fun way for younger children to just listen and play.
Talene has even more games and activities I haven’t covered. But it also has a full parents area where you can download more games and content. You can also check the progress of your child in each activity to see where they might need some guidance or practice.
Have you ever wondered about the altitude where you are? What about the altitude of somewhere you’re planning on visiting? Altimeter Pro can tell you just that and more.
Altimeter Pro is easy to use. Just press and hold anywhere on the map to see the altitude at that latitude and longitude. You’ll also get information on its accuracy range. And you can see live compass direction as well. If you’re moving, it will even tell you your current speed in miles per hour.
There are settings to let you change the units if you’re not into the imperial measurement system or just prefer metric. It will save the locations you check and of course you can easily just check your current location anytime.
Often while we’re working, we can be easily distracted by the many other things our Macs can do. It’s so fast and easy to command-tab over to another app or website between builds or to clear our head. But if you’re like me, then often what should have been a quick break while Xcode does its thing for 30 seconds turns into a 5 minute distraction. Multiply that a few times a day and it’s easily a problem.
That’s where Atento comes in. Atento monitors the time you spend on every app and every website. You can go back and look at the past day or week and easily identify your time thieves. Knowing is half the battle. Once you know that you blow 45 minutes on YouTube in a day, you can be more vigilant about stopping sooner or not visiting at all.
It’s not just about the apps you’re wasting time on; sometimes it’s just productive to get a glimpse at how you split your time between multiple tasks. For example, it turns out I spend about 30 minutes a day on email. That’s not necessarily bad, since I’m normally responding to important emails, but I didn’t realize it was such a large part of my day. So in an 8 hour day, no wonder I never feel like I get 8 hours of actual work done.
How many times has this happened to you? You’re surrounded by hungry friends and family but no one knows where to eat. Ideas have been thrown around and a few places have been vetoed, but no decision has been made. And everyone continues to be hungry. Enter, Dinner Spinner!
Dinner Spinner is perfect for those families that can never decide on a place to eat. Dinner Spinner will find all the restaurants around you and let you spin randomly to pick one. But what good is just randomly picking? You can also customize the list based on how far you’re willing to drive. And best of all, you can go ahead and remove those places that have been vetoed by the group. Or you can easily limit the list to just a few select choices. With Dinner Spinner you can rest easy knowing you’ll never have to choose a place for dinner again.
Simple fighting games used to be the end-all for friends and siblings: one-on-one action, side by side, on a TV you had to sit as close to as possible. Battle Bros wants to bring that back.
Battle Bros: Rival Arena is a platform fighting game for the Apple TV. Battle Bros packs in a ton of content to keep the fights interesting. There are several arenas to fight in, ranging from deserts to grasslands to factories. Each stage has its own obstacles to make this more than simple one-on-one combat; they range from angry turtles to saw blades and lasers. There are various weapons and power-ups to collect as well. So grab a friend and battle it out on the Apple TV with Battle Bros.
To really kick this off, the crew behind Battle Bros has shared a ton of promo codes with me to pass along to you! You can redeem these on your iPhone, then head over to the Purchased section of the Apple TV App Store to download your prize. First come, first served, so get them while they're hot!
WLR4RXE7JNJJ
JLFRAYPT94TR
A64FJ3PRH74N
XX9NWH9LFAE4
PWRYYAMNWTLW
TYAWM7E7KAJW
E7N4YKFKFX77
AFRXX46RF3KR
WNYR99M9HK76
EHWYXK3FNAPH
Each month, I really enjoy seeing what our community of readers comes up with. The apps you build are the reason we keep writing tutorials. Make sure you tell me about your next one; submit here.
If you saw an app you liked, hop over to the App Store and leave a review! A good review always makes a dev's day. And make sure you tell them you're from raywenderlich.com; this is a community of makers.
If you've never made an app, this is the month! Check out our free tutorials to become an iOS star. What are you waiting for? I want to see your app next month.
The post Readers’ App Reviews – June 2017 appeared first on Ray Wenderlich.
The Swift Algorithm Club is an open source project on implementing data structures and algorithms in Swift.
Every month, Vincent Ngo, Ross O’Brien and I feature a cool data structure or algorithm from the club in a tutorial on this site. If you want to learn more about algorithms and data structures, follow along with us!
In this tutorial, you'll analyze 2 algorithms: brute force string search and Boyer Moore string search.
It’s fairly simple to illustrate just how important string searching algorithms are in the world. Press CMD + F and try to search for the letter c. You get the results almost instantly. Now imagine if that took 10 seconds to compute… you might as well retire!
The brute force method is relatively straightforward. To understand it, consider the string "HELLO WORLD".
For the purposes of this tutorial, here's the behavior you're aiming for:
let text = "HELLO WORLD"
text.index(of: "ELLO") // returns 1
text.index(of: "LD") // returns 9
The algorithm is fairly straightforward. For example, assume you are looking for the pattern "LO". You'll begin by iterating through the source string. As soon as you reach a character that matches the first character of your lookup string, you'll try to match the rest of the characters. Otherwise, you'll move on through the rest of the string.
You'll write this method as an extension of String. Using Xcode 9 beta 2 or later, create a new Swift playground. Delete the boilerplate code so you have a blank playground page. You'll start by creating a stub of the implementation inside a String extension. Write the following at the top of your playground:
extension String {
  func index(of pattern: String) -> Index? {
    // more to come
    return nil
  }
}
The purpose of this function is simple: given a string (hereby referred to as the source string), you check to see if another string is within it (hereby referred to as the pattern). If a match can be made, it'll return the index of the first character of the match. If this method can't find a match, it'll return nil.
As of Swift 4, String exposes the indices property, which contains all the indexes used to subscript the string. You'll use this to iterate through your source string. Update the function to the following:
func index(of pattern: String) -> Index? {
  // 1
  for i in indices {
    // 2
    var j = i
    var found = true
    for p in pattern.indices {
      guard j != endIndex && self[j] == pattern[p] else { found = false; break }
      j = index(after: j)
    }
    if found {
      return i
    }
  }
  return nil
}
This does exactly what you wanted:
1. You treat every index of the source string as a potential starting point for a match.
2. From that starting point, you step through the pattern character by character; if a character doesn't match (or you run past the end of the source), you stop and move on to the next starting point.
As soon as you find a match, you’ll return the index. It’s time to test it out. Write the following at the bottom of your playground:
let text = "Hello World"
text.index(of: "lo") // returns 3
text.index(of: "ld") // returns 9
The brute force approach works, but it’s relatively inefficient. In the next section, you’ll look at how you can make use of a clever technique to optimize your algorithm.
As it turns out, you don’t need to look at every character from the source string — you can often skip ahead multiple characters. The skip-ahead algorithm is called Boyer Moore and it’s been around for some time. It is considered the benchmark for all string search algorithms.
This technique builds upon the brute force method, with 2 key differences: it matches the pattern against the source string backwards, and it uses a skip table to jump ahead by more than one character at a time. Here's how it works:
The Boyer Moore technique makes use of a skip table. The idea is fairly straightforward: you create a table based on the word you'd like to match. The table holds the number of steps you may skip for a given letter of the word. For the word "HELLO", the skip table is H: 4, E: 3, L: 1 and O: 0.
You'll use the skip table to decide how many traversals you can skip forward, consulting it before each traversal of the source string. To illustrate, consider a specific example: you're comparing the "H" character in the source string against the last character of the pattern. Since it doesn't match, you'll want to move down the source string. Before that, you consult the skip table to see if there's an opportunity to skip ahead. In this case, "H" is in the skip table and you're able to perform a 4-index skip.
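To make the lookup concrete, here's a tiny sketch using the skip values for "HELLO" from above; the fallback to the full pattern length matches the implementation you'll write shortly:
let skips: [Character: Int] = ["H": 4, "E": 3, "L": 1, "O": 0] // skip table for "HELLO"
let mismatched: Character = "H"               // the source character that just failed to match
let jump = skips[mismatched] ?? "HELLO".count // 4; characters not in the table skip the full pattern length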
Back in your Swift playground, delete the implementation of index(of:), except for return nil:
func index(of pattern: String) -> Index? {
  return nil
}
You'll start by dealing with the skip table. Write the following inside the String extension:
fileprivate var skipTable: [Character: Int] {
  var skipTable: [Character: Int] = [:]
  for (i, c) in enumerated() {
    skipTable[c] = count - i - 1
  }
  return skipTable
}
This will enumerate over a string and return a dictionary with its characters as keys and an integer representing the amount it should skip by. Verify that it works. At the bottom of your playground, write the following:
let helloText = "Hello"
helloText.skipTable.forEach { print($0) }
You should see the following in the console:
(key: "H", value: 4)
(key: "L", value: 1)
(key: "O", value: 0)
(key: "E", value: 3)
This matches the skip table values from earlier.
Another component of the Boyer Moore algorithm is backwards string matching. You'll devise a method to handle that. This method has 3 goals:
1. It compares the source string against the pattern starting from the pattern's last character and working backwards.
2. If the characters stop matching at any point, it returns nil.
3. If the entire pattern matches, it returns the String.Index of the source string that matches the first letter of the pattern.
Write the following beneath skipTable:
// 1
fileprivate func match(from currentIndex: Index, with pattern: String) -> Index? {
  // more to come
  // 2
  return match(from: index(before: currentIndex), with: "\(pattern.dropLast())")
}
This is the recursive method you'll use to do matching against the source and pattern strings:
1. currentIndex keeps track of the current character of the source string you want to match against.
2. Each recursive call steps backwards, comparing the character before currentIndex against the pattern with its last character dropped.
The behavior of this method is to walk from the end of the pattern toward its front, one character of the source string at a time.
Now, it's time to deal with the comparison logic. Update the match method to the following:
fileprivate func match(from currentIndex: Index, with pattern: String) -> Index? {
  // 1
  if currentIndex < startIndex { return nil }
  if currentIndex >= endIndex { return nil }
  // 2
  if self[currentIndex] != pattern.last { return nil }
  // 3
  if pattern.count == 1 && self[currentIndex] == pattern.last { return currentIndex }
  return match(from: index(before: currentIndex), with: "\(pattern.dropLast())")
}
1. If currentIndex ever goes out of bounds, you'll return nil.
2. If the character at currentIndex doesn't match the last character of the pattern, you'll return nil.
3. If only one character of the pattern is left and it matches, then you'll return the current index, indicating a match was made starting at this location.
The bounds and comparison checks are equivalent to this more compact form:
guard currentIndex >= startIndex && currentIndex < endIndex && pattern.last == self[currentIndex]
  else { return nil }
if pattern.count == 1 && self[currentIndex] == pattern.first { return currentIndex }
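If you want to sanity-check the matcher on its own before wiring it into index(of:), a quick throwaway call like this works from the same playground page, since match(from:with:) is fileprivate to that file:
let source = "HELLO"
let lastIndex = source.index(before: source.endIndex) // index of the final "O"
source.match(from: lastIndex, with: "LLO")            // returns the index of the first "L" (offset 2), where "LLO" begins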
With the skip table and matching function ready, it's time to tackle the final piece of the puzzle!
Update the index method to the following:
func index(of pattern: String) -> Index? {
  // 1
  let patternLength = pattern.count
  guard patternLength > 0, patternLength <= count else { return nil }
  // 2
  let skipTable = pattern.skipTable
  let lastChar = pattern.last!
  // 3
  var i = index(startIndex, offsetBy: patternLength - 1)
  // more to come...
  return nil
}
You've set up the playing field:
1. First, you make sure the pattern isn't empty and isn't longer than the source string; otherwise there's nothing to search for.
2. Next, you build the pattern's skip table and grab the pattern's last character for quick comparisons.
3. Finally, you create a String.Index to keep track of traversals. Since you're planning on matching the strings backwards, you can have a small head start by offsetting this index by the length of the pattern.
Next, you'll define the logic for the matching and traversals. Add the following just before the return statement:
// 1
while i < endIndex {
  let c = self[i]
  // 2
  if c == lastChar {
    if let k = match(from: i, with: pattern) { return k }
    i = index(after: i)
  } else {
    // 3
    i = index(i, offsetBy: skipTable[c] ?? patternLength, limitedBy: endIndex) ?? endIndex
  }
}
Here's the play by play:
1. You keep traversing the source string until you reach the endIndex.
2. If the current character matches the last character of the pattern, you call the match function. If this returns a non-nil value, it means you've found a match, so you'll return the index that matches the pattern. Otherwise, you'll move to the next index.
3. If the characters don't match, you consult the skip table and jump ahead by the listed amount; a character that isn't in the table skips the full pattern length, clamped to endIndex.
Time to give it a whirl. Add the following at the bottom of the playground:
let sourceString = "Hello World!"
let pattern = "World"
sourceString.index(of: pattern)
You should get a 6 for the index. Woohoo, it's working!
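If you're curious, a couple of extra spot checks should behave like this with the extension above:
let haystack = "Hello World!"
haystack.index(of: "Hello")   // returns the startIndex (offset 0), since the match is at the very start
haystack.index(of: "planet")  // returns nil, since there's no match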
I hope you enjoyed this tutorial on efficient string searching!
Here is a playground with the above code. You can also find the original implementation and further discussion on the repo for Brute Force String Search and Boyer Moore String Search.
This was just one of the many algorithms in the Swift Algorithm Club repository. If you're interested in more, check out the repo.
It's in your best interest to know about algorithms and data structures: they're solutions to many real-world problems and frequently come up in interview questions. Plus, it's fun!
So stay tuned for many more tutorials from the Swift Algorithm Club in the future. In the meantime, if you have any questions on implementing string search algorithms in Swift, please join the forum discussion below.
The post Swift Algorithm Club: Boyer Moore String Search Algorithm appeared first on Ray Wenderlich.
In this video you'll learn the visitor design pattern, which you'll use to eliminate duplicated logic across view controllers.
The post Video Tutorial: iOS Design Patterns Part 10: Visitor appeared first on Ray Wenderlich.
In this video you'll learn the visitor design pattern, which you'll use to eliminate duplicated logic across view controllers.
The post Video Tutorial: iOS Design Patterns Part 11: Conclusion appeared first on Ray Wenderlich.
Note: Updated by Luke Parham for Xcode 9 beta / iOS 11 / Swift 4. Original post by Cesare Rocchi.
From the dawn of time, man has dreamt of better and better ways of communicating with his brethren far and wide. From carrier pigeons to radio waves, we’re forever trying to communicate more clearly and effectively.
In this modern age, one technology has emerged as an essential tool in our quest for mutual understanding: the humble network socket.
Existing somewhere in layer 4 of our modern networking infrastructure, sockets are at the core of any online communication from texting to online gaming.
You may be wondering, "Why do we need to go lower-level than URLSession in the first place?" If you're not wondering that, then go ahead and pretend you were…
Great question! The thing about communicating with URLSession is that it's based on the HTTP networking protocol. With HTTP, communication happens in a request-response style. This means that the majority of the networking code in most apps follows the same pattern: the app sends a request, then handles whatever response the server sends back.
But what about when you want the server to be able to tell your app about something? Doing so doesn’t really map to HTTP very well. Of course, you can make it work by continually pinging the server and seeing if it has updates, aka polling, or you can get a little more crafty and use a technique like long-polling, but these techniques can feel a little unnatural and each has its own pitfalls. At the end of the day, why limit yourself to this request-response paradigm if it’s not the right tool for the job?
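To make the contrast concrete, here's a rough sketch of what polling might look like with URLSession. The endpoint and the five-second interval are invented purely for illustration; they have nothing to do with the chat server you're about to use:
import Foundation

// Naive polling: ask the server every few seconds whether anything new happened.
// The URL is hypothetical; the real DogeChat server speaks raw TCP, not HTTP.
let pollTimer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { _ in
  let url = URL(string: "http://localhost/latest-messages")!
  URLSession.shared.dataTask(with: url) { data, _, _ in
    // Most polls come back empty, but each one still costs a full request/response round trip.
    if let data = data, !data.isEmpty {
      print("Server finally had something new: \(data.count) bytes")
    }
  }.resume()
}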
In this streams tutorial, you’ll learn how you can drop down a level of abstraction and use sockets directly to create a real-time chatroom application.
Instead of each client having to check the server for new messages, it’ll use input and output streams that remain open for the duration of the chat session.
To begin, download the starter materials which include both the chat app and a simple server written in Go. You won’t have to worry about writing any Go code yourself, but you will need to get this server up and running in order to write a client for it.
The included server was written in Go and then compiled for you. If you’re not the kind of person who trusts a precompiled executable you found on the web, I’ve included the source code, so feel free to compile it yourself.
To run the pre-compiled server, open your terminal, navigate to the starter materials directory and enter this command, followed by your password when prompted:
sudo ./server
After putting your password in, you should see Listening on 127.0.0.1:80. Your chat server is ready to go! You can now skip to the next section.
If you want to compile the server yourself, you’ll need to install Go with Homebrew.
If you don’t have Homebrew either then you’ll have to install that first. Open the terminal, and paste in the following line:
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Then, use this command to install Go:
brew install go
Once that's finished, navigate to the directory of the starter materials and build the server with the go build command.
go build server.go
Finally, you can start your server, using the command listed at the start of this section.
Next, open the DogeChat project and build and run it to get a look at what’s already been built for you.
As shown above, DogeChat is currently set up to allow a user to enter their username and then go into a chat room. Unfortunately, the last person to work on it really wasn’t sure how to write a chat app so they wrote all the UI and basic plumbing, but left it for you to implement the actual networking layer.
To get started on the actual coding, navigate to ChatRoomViewController.swift. Here, you can see that you've got a view controller that is ready and able to receive strings as messages from the input bar, as well as display messages via a table view with custom cells that can be configured with Message objects.
Since you've already got a ChatRoomViewController, it only makes sense that you'd create a ChatRoom class to take care of the heavy lifting.
Before getting started on writing a new class, I like to make a quick list of what its responsibilities will be. For this class, we'll want it to take care of the following:
1. Opening a connection to the chat server
2. Allowing a user to join the chat by providing a username
3. Allowing the user to send and receive messages
4. Closing the connection when you're done
Now that you know what you want, hit ⌘+N to create a new file. Choose Cocoa Touch Class and then name it ChatRoom.
Next, go ahead and replace what's in that file with the following:
import UIKit

class ChatRoom: NSObject {
  //1
  var inputStream: InputStream!
  var outputStream: OutputStream!

  //2
  var username = ""

  //3
  let maxReadLength = 4096
}
Here, you've defined the ChatRoom class and also declared the properties you'll need in order to communicate effectively:
1. The input and output streams you'll use to read from and write to the server.
2. The username the current user chooses when joining the chat.
3. maxReadLength. This variable puts a cap on how much data can be sent in any single message.
Next, go over to ChatRoomViewController.swift and add a chat room property to the list of properties at the top.
let chatRoom = ChatRoom()
Now that you’ve got the basic structure of your class set up, it’s time to knock out the first thing in your checklist, opening a connection between the app and the server.
Head back over to ChatRoom.swift and below the property definitions, add the following method:
func setupNetworkCommunication() {
  // 1
  var readStream: Unmanaged<CFReadStream>?
  var writeStream: Unmanaged<CFWriteStream>?

  // 2
  CFStreamCreatePairWithSocketToHost(kCFAllocatorDefault,
                                     "localhost" as CFString,
                                     80,
                                     &readStream,
                                     &writeStream)
}
Here's what's happening:
1. First, you set up two uninitialized socket streams. They're wrapped in Unmanaged because Core Foundation hands them back without automatic memory management, which is why you'll call takeRetainedValue() on them shortly.
2. Then, CFStreamCreatePairWithSocketToHost binds the read and write streams together and connects them to the host's socket. The function takes four arguments. The first is the type of allocator you want to use when initializing your streams. You should use kCFAllocatorDefault whenever possible, though there are other options if you run into a situation where you need something that acts a little differently.
Next, you specify the hostname. In this case you’re just connecting to the local machine, but if you had a specific IP address for a remote server, you could also use that here.
Then, you specify that you’re connecting via port 80, which is the port we’ve set our server up to listen on.
Finally, you pass in the pointers to your read and write streams so the function can initialize them with the connected read and write streams it will create internally.
Now that you’ve got initialized streams, you can store retained references to them by adding the following lines:
inputStream = readStream!.takeRetainedValue()
outputStream = writeStream!.takeRetainedValue()
Calling takeRetainedValue() on an unmanaged object allows you to simultaneously grab a retained reference and burn an unbalanced retain so the memory isn't leaked later. Now you'll be able to use the input and output streams when you need them.
Next, in order for your app to react to networking events properly, these streams need to be added to a run loop. Do so by adding these two lines to the end of setupNetworkCommunication().
inputStream.schedule(in: .current, forMode: .commonModes)
outputStream.schedule(in: .current, forMode: .commonModes)
Finally, you're ready to open the flood gates! To get the party started, add the following (again to the bottom of setupNetworkCommunication()):
inputStream.open()
outputStream.open()
And that's all there is to it. To finish up, head over to ChatRoomViewController.swift and add the following line to the viewWillAppear(_:) method.
chatRoom.setupNetworkCommunication()
You now have an open connection between your client app and the server running on localhost. You can go ahead and build and run if you want, but you’ll see the same thing you saw before since you haven’t actually tried to do anything with your connection.
Now that you’ve set up your connections to the server, it’s time to actually start communicating something! The first thing you’ll want to say is who exactly you think you are. Later, you’ll also want to start sending messages to people.
This brings up an important point: since you have two kinds of messages, you’ll need to think up a way to differentiate them.
One advantage of dropping down to the TCP level is that you can define your own "protocol" for deciding whether a message is valid or not. With HTTP, you need to think about all those pesky verbs like GET, PUT, and PATCH. You need to construct URLs and use the appropriate headers and all kinds of stuff.
Here we just have two kinds of messages. You can send:
iam:Luke
to enter the room and inform the world of your name. And you can say:
msg:Hey, how goes it mang?
to send a message to everyone else in the room.
This is pure and simple.
This is also blatantly insecure, so maybe don’t use it as-is at work. ;]
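For a concrete sense of what actually travels over the wire, here's a minimal sketch of the two outgoing payloads; it simply mirrors the ASCII encoding used by the joinChat and sendMessage methods you'll write shortly:
import Foundation

// The only two message shapes in DogeChat's tiny protocol.
let joinPayload = "iam:Luke".data(using: .ascii)!                   // announces who you are
let chatPayload = "msg:Hey, how goes it mang?".data(using: .ascii)! // a chat message for the room
// No verbs, headers or URLs; the prefix before the colon is the entire "protocol".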
Now that you know what the server is expecting, you can write a method on your ChatRoom class to allow a user to enter the chat room. The only argument it needs is their desired username.
To implement it, add the following method below the setup method you just wrote.
func joinChat(username: String) {
  //1
  let data = "iam:\(username)".data(using: .ascii)!

  //2
  self.username = username

  //3
  _ = data.withUnsafeBytes { outputStream.write($0, maxLength: data.count) }
}
1. First, you construct the message using the "iam:" protocol format and turn it into a Data object using ASCII encoding.
2. You save off the username so you can reuse it when sending and receiving chat messages later.
3. The write(_:maxLength:) method takes a reference to an unsafe pointer to bytes as its first argument. Data's withUnsafeBytes method provides you with a convenient way to work with an unsafe pointer version of the data within the safe confines of a closure.
Now that your method is ready, head over to ChatRoomViewController.swift and add a call to join the chat at the bottom of viewWillAppear(_:).
chatRoom.joinChat(username: username)
Now, build and run, enter your name, and then hit enter to see…
The same thing?!
Now, hold on, I can explain. Go ahead and go to your Terminal application. Right under Listening on 127.0.0.1:80, you should see Luke has joined, or something along those lines if your name happens not to be Luke.
This is good news, but you’d definitely rather see some indication of success on the phone’s screen…
Luckily, the server takes incoming messages like the join message you just sent, and then sends them to everyone in the room, including you. As fortune would also have it, your app is already set up to show any type of incoming message as a cell in the ChatRoomViewController's table of messages.
All you need to do is use the inputStream to catch these messages, turn them into Message objects, and pass them off and let the table do its thing.
In order to react to incoming messages, the first thing you’ll need to do is have your chat room become the input stream’s delegate. First, go to the bottom of ChatRoom.swift and add the following extension.
extension ChatRoom: StreamDelegate {
}
Now that you've said you conform to the StreamDelegate protocol, you can claim to be the inputStream's delegate.
Go up and add the following line to setupNetworkCommunication(), directly before the calls to schedule(_:forMode:):
inputStream.delegate = self
Next, add this implementation of stream(_:handle:) to the extension.
func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
  switch eventCode {
  case Stream.Event.hasBytesAvailable:
    print("new message received")
  case Stream.Event.endEncountered:
    print("end of stream reached")
  case Stream.Event.errorOccurred:
    print("error occurred")
  case Stream.Event.hasSpaceAvailable:
    print("has space available")
  default:
    print("some other event...")
    break
  }
}
Here, you've really just set yourself up to do something with the incoming events that can occur in relation to a Stream. The one you're really interested in is Stream.Event.hasBytesAvailable, since that means there's an incoming message waiting to be read.
Next, you’ll write a method to handle these incoming messages. Below this function, add:
private func readAvailableBytes(stream: InputStream) {
  //1
  let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: maxReadLength)

  //2
  while stream.hasBytesAvailable {
    //3
    let numberOfBytesRead = inputStream.read(buffer, maxLength: maxReadLength)

    //4
    if numberOfBytesRead < 0 {
      if let _ = stream.streamError {
        break
      }
    }

    //Construct the Message object
  }
}
1. First, you set up a buffer into which you can read the incoming bytes.
2. Next, you loop for as long as the input stream has bytes available to read.
3. On every pass, you call read(_:maxLength:), which will read bytes from the stream and put them into the buffer you pass in.
4. If the call returns a negative value, some error occurred; you check the stream's error and bail out of the loop.
This method needs to be called when the input stream has bytes available, so go up to the Stream.Event.hasBytesAvailable case in the switch statement inside stream(_:handle:) and call the method you're working on below the print statement.
readAvailableBytes(stream: aStream as! InputStream)
At this point, you've got a sweet buffer full of bytes! Before you finish this method, you'll need to write another helper to turn the buffer into a Message object.
Put the following method definition below readAvailableBytes(_:).
private func processedMessageString(buffer: UnsafeMutablePointer<UInt8>,
                                    length: Int) -> Message? {
  //1
  guard let stringArray = String(bytesNoCopy: buffer,
                                 length: length,
                                 encoding: .ascii,
                                 freeWhenDone: true)?.components(separatedBy: ":"),
    let name = stringArray.first,
    let message = stringArray.last else {
      return nil
  }
  //2
  let messageSender: MessageSender = (name == self.username) ? .ourself : .someoneElse
  //3
  return Message(message: message, messageSender: messageSender, username: name)
}
1. First, you initialize a String using the buffer and length that's passed in. You tell the String to free the buffer of bytes when it's done with them, and then split the incoming message on the ":" character so you can get the sender's name and the actual message as separate strings.
2. Next, you figure out whether you or someone else sent the message by comparing the name against the username you saved off earlier.
3. Finally, you construct a Message with the parts you've gathered and return it.
To use your Message construction method, add the following if-let to the end of readAvailableBytes(_:).
if let message = processedMessageString(buffer: buffer, length: numberOfBytesRead) {
  //Notify interested parties
}
At this point, you're all set to pass the Message off to someone, but who?
Well, you really want to tell ChatRoomViewController.swift about the new message, but you don't have a reference to it. Since it holds a strong reference to the ChatRoom, you don't want to explicitly create a circular dependency by adding a ChatRoomViewController property.
This is the perfect time to set up a delegate protocol. The ChatRoom doesn't care what kind of object wants to know about new messages, it just wants to tell someone.
Head to the top of ChatRoom.swift and add the simple protocol definition.
protocol ChatRoomDelegate: class {
  func receivedMessage(message: Message)
}
Next, add a weak optional property to hold a reference to whoever decides to become the ChatRoom's delegate.
weak var delegate: ChatRoomDelegate?
Now, you can go back and really complete readAvailableBytes(_:) by adding the following inside the if-let.
delegate?.receivedMessage(message: message)
To finish things off, go back to ChatRoomViewController.swift and add the following extension that conforms to this protocol, right below the MessageInputDelegate extension.
extension ChatRoomViewController: ChatRoomDelegate {
  func receivedMessage(message: Message) {
    insertNewMessageCell(message)
  }
}
Like I said earlier, the rest of the plumbing has already been set up for you, so insertNewMessageCell(_:) will take your message and take care of adding the appropriate cell to the table.
Now, go and assign the view controller to be the chatRoom's delegate by adding the following line right after the call to super in viewWillAppear(_:).
chatRoom.delegate = self
Once again, build and run, and then enter your name into the text field and hit enter.
The chat room now successfully shows a cell stating that you've entered the room. You've officially sent a message to and received a message from a socket-based TCP server.
Now that you've got the ChatRoom class set up to send and receive messages, it's time to allow users to send actual text back and forth.
Go back over to ChatRoom.swift and add the following method to the bottom of the class definition.
func sendMessage(message: String) {
  let data = "msg:\(message)".data(using: .ascii)!
  _ = data.withUnsafeBytes { outputStream.write($0, maxLength: data.count) }
}
This method is just like the joinChat(_:) method you wrote earlier, except it prepends msg: to the text you send through to denote it as an actual message.
Since you want to send off messages when the inputBar tells the ChatRoomViewController that the user has hit Send, go back over to ChatRoomViewController.swift and find the MessageInputDelegate extension.
Here, you'll see an empty method called sendWasTapped(_:) that gets called at just such a time. To actually send the message, just pass it along to the chatRoom.
chatRoom.sendMessage(message: message)
And that's actually all there is to it! Since the server will receive this message and then forward it back to everyone, the ChatRoom will be notified of a new message the same way it is when you join the room.
Go ahead and build and run and try things out for yourself.
If you want to see someone chatting in return, go to a new terminal window and enter:
telnet localhost 80
This will allow you to connect to the TCP server on the command line. Now you can issue the same commands the app uses to chat from there.
iam:gregg
Then, send a message.
msg:Ay mang, wut's good?
Congrats, you've successfully written a chat client!
If you've ever done any programming with files, you should know that good citizens close files when they're done with them. Well, it turns out that, like everything else in Unix, an open socket connection is represented by a file handle, which means you need to close it when you're done, just like any other file.
To do so, add the following method after your definition of sendMessage(_:).
func stopChatSession() {
  inputStream.close()
  outputStream.close()
}
As you might have guessed, this closes the stream and makes it so information can't be sent or received. These calls also remove the streams from the run loop you scheduled them on earlier.
To finish things up, add this method call to the Stream.Event.endEncountered case in the switch statement.
stopChatSession()
Then, go back to ChatRoomViewController.swift and add the same line to viewWillDisappear(_:).
stopChatSession()
And with that, you're done. Profectu tuo laetamur!
To download the completed chat app, click here.
Now that you've mastered (or at least seen a simple example of) the basics of networking with sockets, there are a few places to go to expand your horizons.
This streams tutorial is an example of communicating using TCP, which opens up a connection and guarantees packets will be delivered if possible. Alternatively, you can also use UDP, or datagram sockets to communicate. These sockets have no such guarantees, which means they're a lot faster and have less overhead. They're useful for applications like gaming. Ever experienced lag? That means you've got a bad connection and a lot of the UDP packets you should be receiving are getting dropped.
Another alternative to using HTTP for an application like this is a technology called WebSockets. Unlike traditional TCP sockets, WebSockets do at least maintain a relationship with HTTP and can be useful to achieve the same real-time communication goals as traditional sockets, all from the comfort and safety of the browser. Of course, WebSockets can be used with an iOS app as well, and we have just the tutorial if you're interested in learning more.
Finally, if you really want to dive deeper into networking, check out the free online book Beej's Guide to Network Programming. Questionable nickname choices aside, this book provides a really thorough and well-written explanation of socket programming. If you're afraid of C then this book may be a little intimidating, but then again, maybe today's the day you face your fears. ;]
I hope you enjoyed this streams tutorial, and as always, feel free to let me know if you have any questions or comments below!
The post Real-Time Communication with Streams Tutorial for iOS appeared first on Ray Wenderlich.