
iOS Metal Tutorial with Swift Part 4: Lighting

Learn how to add lighting into your Metal apps!

Welcome back to our iOS Metal tutorial series!

In the first part of the series, you learned how to get started with Metal and render a simple 2D triangle.

In the second part of the series, you learned how to set up a series of transformations to move from a triangle to a full 3D cube.

In the third part of the series, you learned how to add a texture to the cube.

In this fourth part of the series, you’ll learn how to add some lighting to the cube. As you work through this tutorial, you’ll learn:

  • Some basic light concepts
  • Phong light model components
  • How to calculate the lighting effect for each point in the scene, in shaders

Without further ado, it’s time to get Metal!

Getting Started

Before you begin, you need to understand how lighting works.

Lighting means applying light generated from light sources to rendered objects. That’s how the real world works: light sources (like the sun or lamps) produce light, and rays of that light collide with the environment and illuminate it. Our eyes then pick up this illuminated environment, and an image forms on our retinas.

In the real world there are multiple light sources. They work like this:

[Image: a light source emitting rays in all directions]

Rays are emitted in all directions from the light source.

The same rule applies to our biggest light source: the sun. However, given the huge distance between the Sun and the Earth, it’s safe to treat the small percentage of rays emitted from the Sun that actually reach the Earth as parallel rays.

[Image: parallel rays of sunlight hitting the Earth]

For this tutorial, you’ll use only one light source with parallel rays, just like those of the sun. This is called a directional light and is commonly used in 3D games.

Phong Lighting Model

There are various algorithms used to shade objects based on light sources, but one of the most popular is called the Phong lighting model.

This model is popular for a good reason. Not only is it quite simple to implement and understand, but it’s also quite performant and looks great!

The Phong lighting model consists of three components:

[Image: the three components of the Phong lighting model: ambient, diffuse, and specular]

  1. Ambient Lighting: Represents light that hits an object from all directions. You can think of this as light bouncing around a room.
  2. Diffuse Lighting: Represents light that is brighter the more a surface faces the light source. Of all three components, I’d argue this is the most important for the visual effect.
  3. Specular Lighting: Represents light that causes a bright spot on the small area of a surface that points directly toward the light source. You can think of this as the bright spot on a shiny piece of metal.

You will learn more about each of these components as you implement them in this tutorial.

Project Setup

It’s time to code! Start by downloading the starter project for this tutorial. It’s exactly where you finished in the previous tutorial, except that it has been updated for Xcode 7 and Swift 2.0.

Run it on a Metal-compatible iOS device, just to be sure it works correctly. You should see the following:

[Screenshot: the textured 3D cube from the previous tutorial]

This is the 3D cube from the previous part. It looks great, except that all areas of the cube are currently lit evenly, which makes it look a bit flat. Let’s improve its looks through the power of lighting!

Ambient Lighting Overview

Remember that ambient lighting highlights all surfaces in the scene the same amount, no matter where the surface is located, which direction the surface is facing, or what the light direction is.

To calculate ambient lighting, you need two parameters:

  1. Light color: Light can be different colors. For example, if a light is red, it will tint each object the light hits red. For this tutorial you will use a plain white color for the light. White light is a common choice, since white doesn’t tint the material of the object any differently.
  2. Ambient intensity: This is a value that represents the strength of the light. The higher the value, the stronger the illumination of the scene will be.

Once you have those parameters, you can calculate the ambient lighting as follows:

Ambient color = Light color * Ambient intensity
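To make the formula concrete, here’s a minimal Swift sketch of the arithmetic, done on the CPU purely for illustration (the names below are hypothetical; the real calculation will happen in the fragment shader):

// Ambient color = Light color * Ambient intensity, channel by channel
let lightColor: (Float, Float, Float) = (1.0, 1.0, 1.0)  // white light
let ambientIntensity: Float = 0.2

let ambientColor = (lightColor.0 * ambientIntensity,
                    lightColor.1 * ambientIntensity,
                    lightColor.2 * ambientIntensity)
// ambientColor is (0.2, 0.2, 0.2): a dim, untinted contribution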

Let’s give this a shot in code!

Adding Ambient Lighting

Creating a Light Structure

First, you need a structure to store light data.

Add a new Swift file to your project named Light.swift and replace its contents with the following:

import Foundation
 
struct Light {
 
  var color: (Float, Float, Float)  // 1
  var ambientIntensity: Float       // 2
 
  static func size() -> Int {       // 3
    return sizeof(Float) * 4
  }
 
  func raw() -> [Float] {
    let raw = [color.0, color.1, color.2, ambientIntensity]   // 4
    return raw
  }
}

Let’s review this section by section:

  1. A property that stores the light color in red, green, and blue.
  2. Stores the intensity of the ambient effect.
  3. This is a convenience function to get the size of the Light structure.
  4. This is a convenience function to convert the structure to an array of floats. You’ll use this and the size() function to send the light data to the GPU.

This is similar to the Vertex structure that you created in part 2 of this series.
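As a quick sanity check (illustration only, not something you need to add to the project), here’s what the convenience methods return for a white light:

// raw() flattens the struct into 4 floats; size() reports the matching byte count
let testLight = Light(color: (1.0, 1.0, 1.0), ambientIntensity: 0.2)
print(testLight.raw())   // [1.0, 1.0, 1.0, 0.2]
print(Light.size())      // 16 (4 floats * 4 bytes each)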

Now open Node.swift and add the following constant to the class:

let light = Light(color: (1.0,1.0,1.0), ambientIntensity: 0.2)

This creates a white light with a low intensity (0.2).

Passing the Light Data to the GPU

Next you need to pass this light data to the GPU. You already include the projection and model view matrices in the uniform buffer; you’ll modify this to include the light data too.

To do this, open BufferProvider.swift, and in init(device:inflightBuffersCount:) replace this line of code:

let sizeOfUniformsBuffer = sizeof(Float) * (2 * Matrix4.numberOfElements())

…with this one:

let sizeOfUniformsBuffer = sizeof(Float) * (2 * Matrix4.numberOfElements()) + Light.size()

Here you’ve increased the size of the uniform buffers so that you now have room for the light data.
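To picture what that size covers, here’s the byte layout of a single uniforms buffer, assuming 4-byte Floats and that Matrix4.numberOfElements() returns 16 (a 4x4 matrix):

// Layout of one uniforms buffer (illustration only)
let matrixSize = sizeof(Float) * Matrix4.numberOfElements()  // 4 * 16 = 64 bytes
// bytes   0..<64   modelView matrix
// bytes  64..<128  projection matrix
// bytes 128..<144  light data (Light.size() is currently 16 bytes)
let sizeOfUniformsBuffer = 2 * matrixSize + Light.size()     // 144 bytes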

Now change this method declaration:

func nextUniformsBuffer(projectionMatrix: Matrix4, modelViewMatrix: Matrix4) -> MTLBuffer

…to be like this:

func nextUniformsBuffer(projectionMatrix: Matrix4, modelViewMatrix: Matrix4, light: Light) -> MTLBuffer

Here you added an extra parameter for the light data. Now inside this same method, find these lines:

memcpy(bufferPointer, modelViewMatrix.raw(), sizeof(Float)*Matrix4.numberOfElements())
memcpy(bufferPointer + sizeof(Float)*Matrix4.numberOfElements(), projectionMatrix.raw(), sizeof(Float)*Matrix4.numberOfElements())

…and add this line just below:

memcpy(bufferPointer + 2*sizeof(Float)*Matrix4.numberOfElements(), light.raw(), Light.size())

With this additional memcpy() call, you copy the light data into the uniform buffer, just as you did with the projection and modelView matrices.
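For reference, here’s a hedged sketch of how the copy section of nextUniformsBuffer(_:modelViewMatrix:light:) reads once all three memcpy() calls are in place; the uniformsBuffer name below stands in for whichever MTLBuffer your method pulls from its pool, and the rest of the method stays as it was in the starter project:

// Sketch only: copy modelView, then projection, then the light data, back to back
let bufferPointer = uniformsBuffer.contents()
memcpy(bufferPointer, modelViewMatrix.raw(), sizeof(Float) * Matrix4.numberOfElements())
memcpy(bufferPointer + sizeof(Float) * Matrix4.numberOfElements(),
       projectionMatrix.raw(), sizeof(Float) * Matrix4.numberOfElements())
memcpy(bufferPointer + 2 * sizeof(Float) * Matrix4.numberOfElements(),
       light.raw(), Light.size())
return uniformsBuffer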

Modifying the Shaders to Accept the Light Data

Now that the data is passed to the GPU, you need to modify your shaders to use it. To do this, open Shaders.metal and, below the VertexOut structure, add a new structure for the light data you pass across:

struct Light{
  packed_float3 color;
  float ambientIntensity;
};

Now modify the Uniforms structure to contain Light, as follows:

struct Uniforms{
  float4x4 modelMatrix;
  float4x4 projectionMatrix;
  Light light;
};

At this point, you can access the light data inside the vertex shader. However, you also need this data in the fragment shader.

To do this, change the fragment shader declaration to match this:

fragment float4 basic_fragment(VertexOut interpolated [[stage_in]],
                               const device Uniforms&  uniforms    [[ buffer(1) ]],
                               texture2d<float>  tex2D     [[ texture(0) ]],
                               sampler           sampler2D [[ sampler(0) ]])

This adds the uniform data as the second parameter.

Now, before you forget, open Node.swift, and inside render(_:pipelineState:drawable:parentModelViewMatrix:projectionMatrix:clearColor:) find this line:

renderEncoder.setVertexBuffer(uniformBuffer, offset: 0, atIndex: 1)

…and add this line right underneath:

renderEncoder.setFragmentBuffer(uniformBuffer, offset: 0, atIndex: 1)

By adding this code, you pass the uniform buffer not only to the vertex shader, but to the fragment shader as well.

While you’re in this method, you’ll notice an error on this line:

let uniformBuffer = bufferProvider.nextUniformsBuffer(projectionMatrix, modelViewMatrix: nodeModelMatrix)

To fix this error, you need to pass the light data to the buffer provider. To do this, replace that line with the following:

let uniformBuffer = bufferProvider.nextUniformsBuffer(projectionMatrix, modelViewMatrix: nodeModelMatrix, light: light)

Take a step back to make sure you understand what you’ve done so far. At this point, you’ve passed the Light data from the CPU to the GPU, and more specifically, to the fragment shader. This is very similar to how you passed the matrices to the GPU in previous parts.

Make sure you understand the flow, because you will pass some more data later, in a similar fashion.

Adding the Ambient Light Calculation

Now return to the fragment shader in Shaders.metal. Add these lines to the top of the fragment shader:

// Ambient
Light light = uniforms.light;
float4 ambientColor = float4(light.color * light.ambientIntensity, 1);

This retrieves the light data from the uniforms, and uses the values to calculate the ambientColor using the algorithm discussed earlier.

Now that you have calculated ambientColor, replace the last line of the method as follows:

return color * ambientColor;

This multiplies the color of the material by the calculated ambient color.

That’s it! Build and run the app and you’ll see the following:

[Screenshot: the cube rendered with only ambient lighting; the scene is very dark]

What’s Going On?

As you can see, it’s really dark right now. You might ask yourself: “Adding ambient light just made my scene darker! Is that really the way it works?!”


Although it may seem strange, the answer is “Yes”!

Another way of looking at it is that without any light, everything is pitch black. By adding a small amount of ambient light, you’ve illuminated your objects slightly, as if it were the early dawn of a morning.

You might also wonder, “Why hasn’t the background changed?” The answer is simple: the vertex shader runs on scene geometry, but the background is not geometry. In fact, it’s not even a background; it’s just a constant color the GPU uses wherever nothing is drawn.

The green color, despite being the quintessence of awesomeness, no longer suits the role of clear color. So in Node.swift, inside render(_:pipelineState:drawable:parentModelViewMatrix:projectionMatrix:clearColor:), find this line:

renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 104.0/255.0, blue: 5.0/255.0, alpha: 1.0)

…and replace it with the following:

renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 0.0, blue: 0.0, alpha: 1.0)

Build and run, and you’ll see the following:

[Screenshot: the dimly lit cube against a black background]

Now it looks a lot less confusing!

Diffuse Lighting Overview

Introducing Normals

To calculate diffuse lighting, you need to know which direction each vertex is facing. You do this by associating a normal with each vertex.

So what is the normal of a vertex? It’s a vector that is perpendicular to the surface the vertex is a part of.

Take a look at this picture to get a better sense of it:

[Image: vertex normals pointing perpendicular to their surfaces]

You will store the normal of each vertex in the Vertex structure, similarly to how you store texture coordinates or position values.

Introducing Dot Products

There is also one important topic left to cover before the actual implementation of diffuse lighting – the dot product of vectors.

The dot product is a mathematical operation between two vectors. For unit-length vectors:

  • When the vectors are parallel: their dot product equals 1.
  • When the vectors point in opposite directions: their dot product equals -1.
  • When the angle between the vectors is 90°: their dot product equals 0.

[Image: dot product values for parallel, perpendicular, and opposite vectors]

This will come in handy shortly.
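If you’d like to experiment with this in a playground, here’s a tiny Swift sketch of the dot product for 3-component vectors (the shaders will use Metal’s built-in dot function instead):

// Dot product of two 3-component vectors
func dot3(a: (Float, Float, Float), _ b: (Float, Float, Float)) -> Float {
  return a.0 * b.0 + a.1 * b.1 + a.2 * b.2
}

dot3((0, 0, 1), (0, 0, 1))    //  1.0: parallel unit vectors
dot3((0, 0, 1), (0, 0, -1))   // -1.0: opposite unit vectors
dot3((0, 0, 1), (1, 0, 0))    //  0.0: perpendicular unit vectors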

Introducing Diffuse Lighting

Now that you have normals and understand the dot product, let’s see how to implement diffuse lighting.

Remember that diffuse lighting is brighter the more a surface’s normal faces toward the light, and (you guessed it) weaker otherwise.

To calculate diffuse lighting, you need three parameters:

  1. Light color: You need the color of the light, similar to ambient lighting. In this tutorial, you’ll use the same color for all types of light (ambient, diffuse, and specular), but you can make them different if you want.
  2. Diffuse intensity: This is a value similar to Ambient intensity; the bigger it is, the stronger the diffuse effect is.
  3. Diffuse factor: This is a value you get by taking the dot product of the light direction vector and the vertex normal. The smaller the angle between those two vectors, the higher this value, and the stronger the diffuse lighting effect.

You can calculate the diffuse lighting as follows:

Diffuse Color = Light Color * Diffuse Intensity * Diffuse Factor

[Image: the diffuse factor at various points on an object, based on their angle to the light direction]

In the picture, you can see the dot products at various points on the object, depending on how they point relative to the light direction; this represents the diffuse factor. The higher the diffuse factor, the brighter the diffuse light.
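As a quick CPU-side illustration of the formula before touching the shaders (the 0.7 dot product below is just a made-up example value):

// Diffuse Color = Light Color * Diffuse Intensity * Diffuse Factor
let lightColor: (Float, Float, Float) = (1.0, 1.0, 1.0)
let diffuseIntensity: Float = 0.8

// Clamp the dot product so surfaces facing away from the light get no diffuse contribution
let diffuseFactor: Float = max(0.0, 0.7)

let diffuseColor = (lightColor.0 * diffuseIntensity * diffuseFactor,
                    lightColor.1 * diffuseIntensity * diffuseFactor,
                    lightColor.2 * diffuseIntensity * diffuseFactor)
// diffuseColor is (0.56, 0.56, 0.56) for this example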

Now let’s dive into the implementation!

Adding Diffuse Lighting

Adding Normal Data

First things first, you need to add normal data to Vertex.

Open Vertex.swift and find these properties:

var s,t: Float       // texture coordinates

Below those properties, add the following properties:

var nX,nY,nZ: Float  // normal

Now modify func floatBuffer() to look like this:

func floatBuffer() -> [Float] {
    return [x,y,z,r,g,b,a,s,t,nX,nY,nZ]
}

This adds the new normal properties to the buffer of floats.

Now open Cube.swift and change the vertices to match these:

//Front
let A = Vertex(x: -1.0, y:   1.0, z:   1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.25, t: 0.25, nX: 0.0, nY: 0.0, nZ: 1.0)
let B = Vertex(x: -1.0, y:  -1.0, z:   1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.25, t: 0.50, nX: 0.0, nY: 0.0, nZ: 1.0)
let C = Vertex(x:  1.0, y:  -1.0, z:   1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 0.50, t: 0.50, nX: 0.0, nY: 0.0, nZ: 1.0)
let D = Vertex(x:  1.0, y:   1.0, z:   1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 0.50, t: 0.25, nX: 0.0, nY: 0.0, nZ: 1.0)
 
//Left
let E = Vertex(x: -1.0, y:   1.0, z:  -1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.00, t: 0.25, nX: -1.0, nY: 0.0, nZ: 0.0)
let F = Vertex(x: -1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.00, t: 0.50, nX: -1.0, nY: 0.0, nZ: 0.0)
let G = Vertex(x: -1.0, y:  -1.0, z:   1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 0.25, t: 0.50, nX: -1.0, nY: 0.0, nZ: 0.0)
let H = Vertex(x: -1.0, y:   1.0, z:   1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 0.25, t: 0.25, nX: -1.0, nY: 0.0, nZ: 0.0)
 
//Right
let I = Vertex(x:  1.0, y:   1.0, z:   1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.50, t: 0.25, nX: 1.0, nY: 0.0, nZ: 0.0)
let J = Vertex(x:  1.0, y:  -1.0, z:   1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.50, t: 0.50, nX: 1.0, nY: 0.0, nZ: 0.0)
let K = Vertex(x:  1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 0.75, t: 0.50, nX: 1.0, nY: 0.0, nZ: 0.0)
let L = Vertex(x:  1.0, y:   1.0, z:  -1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 0.75, t: 0.25, nX: 1.0, nY: 0.0, nZ: 0.0)
 
//Top
let M = Vertex(x: -1.0, y:   1.0, z:  -1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.25, t: 0.00, nX: 0.0, nY: 1.0, nZ: 0.0)
let N = Vertex(x: -1.0, y:   1.0, z:   1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.25, t: 0.25, nX: 0.0, nY: 1.0, nZ: 0.0)
let O = Vertex(x:  1.0, y:   1.0, z:   1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 0.50, t: 0.25, nX: 0.0, nY: 1.0, nZ: 0.0)
let P = Vertex(x:  1.0, y:   1.0, z:  -1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 0.50, t: 0.00, nX: 0.0, nY: 1.0, nZ: 0.0)
 
//Bot
let Q = Vertex(x: -1.0, y:  -1.0, z:   1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.25, t: 0.50, nX: 0.0, nY: -1.0, nZ: 0.0)
let R = Vertex(x: -1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.25, t: 0.75, nX: 0.0, nY: -1.0, nZ: 0.0)
let S = Vertex(x:  1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 0.50, t: 0.75, nX: 0.0, nY: -1.0, nZ: 0.0)
let T = Vertex(x:  1.0, y:  -1.0, z:   1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 0.50, t: 0.50, nX: 0.0, nY: -1.0, nZ: 0.0)
 
//Back
let U = Vertex(x:  1.0, y:   1.0, z:  -1.0, r:  1.0, g:  0.0, b:  0.0, a:  1.0, s: 0.75, t: 0.25, nX: 0.0, nY: 0.0, nZ: -1.0)
let V = Vertex(x:  1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  1.0, b:  0.0, a:  1.0, s: 0.75, t: 0.50, nX: 0.0, nY: 0.0, nZ: -1.0)
let W = Vertex(x: -1.0, y:  -1.0, z:  -1.0, r:  0.0, g:  0.0, b:  1.0, a:  1.0, s: 1.00, t: 0.50, nX: 0.0, nY: 0.0, nZ: -1.0)
let X = Vertex(x: -1.0, y:   1.0, z:  -1.0, r:  0.1, g:  0.6, b:  0.4, a:  1.0, s: 1.00, t: 0.25, nX: 0.0, nY: 0.0, nZ: -1.0)

This adds a normal to each vertex.

If you don’t understand those normal values, try sketching a cube on a piece of paper and writing out the normal for each vertex. You’ll get the same numbers!

When you think about it, it makes sense: all vertices on the same face share the same normal.

Build and run, and you’ll see the following:

[Screenshot: the cube rendered with scrambled, glitched geometry]

Woooooooooooow!


If glitches like this are not a good reason to learn 3D graphics, then what is? :]

Do you have any idea what went wrong?

Passing the Normal Data to the GPU

At this point your vertex structure includes normal data, but your shader does not expect this data.

Therefore, the shader reads what it thinks is the next vertex’s position data from the memory where the previous vertex’s normal data is actually stored. That’s why you end up with this weird look.

To fix this, open Shaders.metal. In the VertexIn structure, add this below all the other components:

packed_float3 normal;

Build and run. Voila! The cube looks just as expected.

[Screenshot: the cube rendering correctly again]

Adding Diffuse Lighting Data

Right now, the Light structures don’t have all the data needed for diffuse lighting, so let’s add some.

In Shaders.metal, add two new values to the bottom of the Light structure:

packed_float3 direction;
float diffuseIntensity;

Now open Light.swift and add these properties below ambientIntensity:

var direction: (Float, Float, Float)
var diffuseIntensity: Float

Also modify the methods to look like this:

static func size() -> Int {
    return sizeof(Float) * 8
}
 
func raw() -> [Float] {
    let raw = [color.0, color.1, color.2, ambientIntensity, direction.0, direction.1, direction.2, diffuseIntensity]
    return raw
}

Nothing fancy: you’ve just added two properties, used them in the raw float array, and increased the size value.

Next, open Node.swift and modify the light constant to match this:

let light = Light(color: (1.0,1.0,1.0), ambientIntensity: 0.2, direction: (0.0, 0.0, 1.0), diffuseIntensity: 0.8)

The direction you pass, (0.0, 0.0, 1.0), is a vector pointing perpendicular to the screen. This means the light points the same way as the camera. You also set the diffuse intensity to a fairly large value (0.8).

Adding the Diffuse Lighting Calculation

Now let’s actually use the normal data. Right now you have normal data in the vertex shader, but you need the interpolated normal for each fragment. So you need to pass normal data to VertexOut.

To do this, open Shaders.metal and inside VertexOut add this below the other components:

float3 normal;

Then in the vertex shader find this line:

VertexOut.texCoord = VertexIn.texCoord;

…and add this right below:

VertexOut.normal = (mv_Matrix * float4(VertexIn.normal, 0.0)).xyz;

This way, you’ll get an interpolated normal value for each fragment in the fragment shader.

Now in the fragment shader, add this right after the ambient color part:

//Diffuse
float diffuseFactor = max(0.0,dot(interpolated.normal, light.direction)); // 1
float4 diffuseColor = float4(light.color * light.diffuseIntensity * diffuseFactor ,1.0); // 2

Let’s discuss this line by line:

  1. Here you calculate the diffuse factor. There is some math involved, so let’s go through it from right to left:
    1. You take the dot product of the fragment’s normal and the light direction.
    2. As discussed previously, this returns a value from -1 to 1, depending on the angle between the two vectors.
    3. You need this value clamped to the range 0 to 1, so you use the max function to force any negative values to 0.
  2. Here’s how you get the diffuse color: you multiply the light color by the diffuse intensity and the diffuse factor. You also set the alpha to 1 and make it a float4 value.

Almost done! Change the last line in the fragment shader from:

return color * ambientColor;

…to:

return color * (ambientColor + diffuseColor);

Build and run, and you’ll see the following:

[Screenshot: the cube lit with ambient and diffuse lighting]

Looking good, eh? For an even better look, find this line in Node.swift:

let light = Light(color: (1.0,1.0,1.0), ambientIntensity: 0.2, direction: (0.0, 0.0, 1.0), diffuseIntensity: 0.8)

And change the ambient intensity to 0.1:

let light = Light(color: (1.0,1.0,1.0), ambientIntensity: 0.1, direction: (0.0, 0.0, 1.0), diffuseIntensity: 0.8)

Build and run again and there will be less ambient light, making the diffuse effect more noticeable:

[Screenshot: the cube with reduced ambient light, making the diffuse effect more noticeable]

As you can see, the more a face points toward the light source, the brighter it becomes.


Specular Lighting Overview

Specular lighting is the third and final component of the Phong lighting model.

Remember, you can think of this component as the one that exposes the shininess of objects.
It’s like when light falls on a metallic object and you see a small, extremely shiny spot.

You calculate the specular color in a similar fashion to how you calculate the diffuse color:

SpecularColor = LightColor * SpecularIntensity * SpecularFactor

Specular intensity, just like the diffuse and ambient intensities, is a value you can tweak to get exactly the look you want.

The real question is: what is the specular factor? For that, have a look at the following picture:

[Image: a light ray hitting a vertex, showing the normal (n), the reflection vector (r), and the direction toward the camera]

In this illustration, you have a light ray that hits a vertex. The vertex has a normal (n), and the light reflects off the vertex in a particular direction (r). The question is: how close is that reflection vector to the vector that points toward the camera?

  1. The more this reflected vector points toward the camera, the shinier you want this point to be.
  2. The farther this vector points away from the camera, the darker the fragment should become; unlike diffuse lighting, you want this falloff to happen fairly quickly, to get that cool metallic effect.

To calculate the specular factor, you use our good buddy the dot product:

SpecularFactor = (r * eye)^shininess

After you get the dot product of the reflected vector and the eye vector, you raise it to the power of a new value: shininess. Shininess is a material parameter; for example, wooden objects have less shininess than metallic objects.
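To get a feel for what raising to the shininess power does, here’s a tiny Swift experiment with made-up numbers:

import Foundation  // for pow

// The same alignment value, raised to different shininess exponents
let alignment = 0.9          // reflection almost pointing at the camera
pow(alignment, 1.0)          // 0.9      : broad, soft falloff
pow(alignment, 10.0)         // ~0.35    : falls off noticeably faster
pow(alignment, 100.0)        // ~0.00003 : only near-perfect alignment stays bright

The higher the shininess, the faster the highlight fades as the reflection drifts away from the camera, which is exactly the tight, metallic-looking spot you’re after.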

Adding Specular Lighting

Adding Specular Lighting Data

First things first, open Light.swift and add two properties below the others:

var shininess: Float
var specularIntensity: Float

Note: Shininess is not really a parameter of the light; it’s a parameter of the object’s material. But for the sake of this tutorial, you’ll keep things simple and pass it along with the Light.

As always, don’t forget to modify the methods to include the new values:

static func size() -> Int {
    return sizeof(Float) * 10
}
 
func raw() -> [Float] {
    let raw = [color.0, color.1, color.2, ambientIntensity, direction.0, direction.1, direction.2, diffuseIntensity, shininess, specularIntensity]
    return raw
}

Now in Node.swift change the light constant value to this:

let light = Light(color: (1.0,1.0,1.0), ambientIntensity: 0.1, direction: (0.0, 0.0, 1.0), diffuseIntensity: 0.8, shininess: 10, specularIntensity: 2)

Now open Shaders.metal and add this to its Light structure:

float shininess;         
float specularIntensity;

Build and run.


Crash?!

Byte Alignment

The problem you’ve hit is a bit subtle, so listen carefully. In the Light structure, the size() method returns 10 * sizeof(Float) = 40 bytes.

In Shaders.metal, the Light structure should also be 40 bytes, because it’s exactly the same structure. Right?

Right, but that’s not how the GPU works: the GPU operates on memory in 16-byte chunks.

To see what’s going on, replace the Light structure in Shaders.metal with this annotated version:

struct Light{
  packed_float3 color;      // 0 - 2
  float ambientIntensity;          // 3
  packed_float3 direction;  // 4 - 6
  float diffuseIntensity;   // 7
  float shininess;          // 8
  float specularIntensity;  // 9
 
  /*
  _______________________
 |0 1 2 3|4 5 6 7|8 9    |
  -----------------------
 |       |       |       |
 | chunk0| chunk1| chunk2|
  */
};

Even though you have only 10 floats, the GPU still allocates memory for 12 floats, so you get a size mismatch error.
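If you’d rather compute that padded size than hard-code it, a small rounding helper (hypothetical, not part of the starter project) shows the idea:

// Round a byte count up to the next multiple of 16, the GPU's chunk size
func alignedSize(rawSize: Int, alignment: Int = 16) -> Int {
  return ((rawSize + alignment - 1) / alignment) * alignment
}

alignedSize(10 * sizeof(Float))   // 40 bytes rounds up to 48 bytes, i.e. 12 floats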

All you need to do to fix this crash is to increase the Light structure size to match those 3 chunks (12 floats).

Open Light.swift and change the size() method to return 12 instead of 10:

static func size() -> Int {
  return sizeof(Float) * 12
}

Build and run. Everything should be working OK.

[Screenshot: the cube looks the same as before, but no longer crashes]

Adding the Specular Lighting Calculation

Now that you have the data passed through, it’s time for the calculation itself.

Open Shaders.metal, and add the following value to the VertexOut struct, right below position:

float3 fragmentPosition;

And in the vertex shader find this line:

VertexOut.position = proj_Matrix * mv_Matrix * float4(VertexIn.position,1);

…and add this line just below it:

VertexOut.fragmentPosition = (mv_Matrix * float4(VertexIn.position,1)).xyz;

This new “fragment position” value does just what it says: it’s the fragment’s position relative to the camera. You’ll use this value to get the eye vector.

Now add the following under the diffuse calculations in the fragment shader:

//Specular
float3 eye = normalize(interpolated.fragmentPosition); //1
float3 reflection = reflect(light.direction, interpolated.normal); // 2
float specularFactor = pow(max(0.0, dot(reflection, eye)), light.shininess); //3
float4 specularColor = float4(light.color * light.specularIntensity * specularFactor ,1.0);//4

This is the same algorithm you learned about earlier:

  1. Get the eye vector.
  2. Calculate the vector of the light reflecting off the current fragment, using its normal.
  3. Calculate the specular factor.
  4. Combine all the values above to get the specularColor.

Now that you have the specular color, modify the return line in the fragment shader to match:

return color * (ambientColor + diffuseColor + specularColor);

Build and run.

[Screenshot: the cube with ambient, diffuse, and specular lighting, showing a shiny highlight]

Enjoy your new shiny object!

Where To Go From Here?

Here is the final example project from this iOS Metal Tutorial.

Nicely done. Take a moment to review what you’ve done in this tutorial.

  1. You created a Light structure to send to the GPU along with the matrices in the uniform buffer.
  2. You modified the BufferProvider class to handle the Light data.
  3. You implemented ambient lighting, diffuse lighting, and specular lighting.
  4. You learned how the GPU handles memory alignment, and fixed a crash.

Now go for a walk or take a nap – you totally deserve some rest! :]

Don’t feel tired? We hope to make more Metal tutorials in the future, but in the meantime, be sure to check out some of these great resources:

Also tune into the OpenGL ES video tutorials on this site, and learn as Ray explains — in depth — how many of these same concepts work in OpenGL ES.

Thank you for joining me for this tour through Metal. As you can see, it’s a powerful technology that’s relatively easy to implement once you understand how it works.

If you have questions, comments or Metal discoveries to share, please leave them in the comments below!
