Part Thirteen: Rendering Textures

Once you’ve created an object in OpenGL, like our square made of two triangles, you can map a texture onto it. First, you need an image that you can load into your program; then you use texture coordinates to specify where on that texture your object lies.

For our sample program, I grabbed an Apple logo and resized it to 128 by 128.

(Image: apple_logo.png — the 128 × 128 texture)

We’ll use this as our texture for our formerly rainbow-colored square.

Texture coordinates, sometimes referred to as UV coordinates, are called ST coordinates by OpenGL. Just like we use X and Y coordinates to define where our triangle vertices are in OpenGL space, we will use S and T coordinates to specify where on the texture our shape(s) will be.

What does that mean? Remember how OpenGL space is centered around coordinate (0, 0), extending one unit in all directions? Texture coordinates don’t have anything to do with OpenGL space; they relate directly to your texture image. Texture coordinates, referred to as (S, T), start at (0, 0) at your texture’s lower left-hand corner and go to (1, 1) at your texture’s upper right-hand corner. For our 128 by 128 image, for example, (0.5, 0.5) falls at the center of the texture, around pixel (64, 64).

(Diagram: the ST texture coordinate space, from (0, 0) at the lower left to (1, 1) at the upper right)

Once I have my texture loaded into OpenGL, I create an array of texture coordinates to match the number of vertices that I’m using to create my object. In our case, we’re defining four vertices to create a square using OpenGL’s triangle strip drawing method. For each vertex that we pass in, we’ll also be passing in a texture coordinate to tell OpenGL where on the texture that vertex should be.

We know that our vertices are currently defined as the following.

    static const GLfloat squareVertices[] = {
        -0.5f, -0.5f,
        0.5f, -0.5f,
        -0.5f, 0.5f,
        0.5f, 0.5f,
    };

So in order to use a texture, I need to specify texture coordinates that tell OpenGL where on the texture to put each vertex.

Let’s suppose I create the following array of texture coordinates.

    static const GLfloat squareTexCoords[] = {
        0, 0,
        1, 0,
        0, 1,
        1, 1
    };

With this array of texture coordinates, I’m telling OpenGL to put the first vertex (-0.5, -0.5) at the lower left-hand corner of the texture, (0, 0). The second vertex (0.5, -0.5) will be placed at texture coordinate (1, 0), or the lower right-hand corner of the texture. The third vertex (-0.5, 0.5) will be placed at the upper left-hand corner of the texture, (0, 1). Finally, the last vertex (0.5, 0.5) will be placed at the upper right-hand corner of the texture, (1, 1).

The texture (represented by the black lines) will line up with the triangle strip (the red lines) like this.

(Diagram: the texture, in black, lined up with the triangle strip, in red)

If we were to run this code, the texture would be applied as such.

(Screenshot: iPhone screen)

There’s our square, spinning around with the new texture fitting perfectly.

But to really understand how all of these coordinates relate to each other, let’s play around with them a bit.

We know that the texture runs from ST coordinates (0, 0) in the lower left-hand corner to (1, 1) in the upper right-hand corner. What happens if we make our square’s texture coordinates smaller than the texture space? Let’s try the following texture coordinates.

    static const GLfloat squareTexCoords[] = {
        0.25, 0.25,
        0.75, 0.25,
        0.25, 0.75,
        0.75, 0.75
    };

With these texture coordinates, we’re placing the square’s vertices about 0.25 units inside the texture. Instead of our vertices going all the way from (0, 0) to (1, 1) texture coordinates, we’re only going from (0.25, 0.25) to (0.75, 0.75). OpenGL will draw our triangle strip something like this.

(Diagram: the square’s texture coordinates inset within the larger texture)

See how the texture is now larger than the area covered by our square? Let’s see how this looks when we build and run it.

(Screenshot: iPhone screen)

Now our spinning square shows only the portion of the texture that falls within its texture coordinates.

But what if the situation were reversed? What if our square extended beyond the one-square-unit area that defines the texture, (0, 0) to (1, 1)? Let’s change those texture coordinates again.

    static const GLfloat squareTexCoords[] = {
        -1, -1,
        2, -1,
        -1, 2,
        2, 2
    };

This time I’m telling OpenGL that our square actually extends a full unit over the texture in all directions. While the texture runs from (0, 0) to (1, 1), our triangle strip has been mapped to the texture from (-1, -1) to (2, 2). OpenGL is going to map our triangle strip to our texture like this.

(Diagram: the triangle strip extending a full unit past the texture on every side)

So now there’s not enough texture to cover our surface area. What do you think OpenGL will do?

(Screenshot: iPhone screen)

By default, OpenGL will repeat (tile) your texture across the surface if your texture coordinates make the surface larger than the texture.

Pretty cool stuff.
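This tiling behavior is controlled by the texture’s wrap mode, which defaults to GL_REPEAT. If you’d rather have the texture’s edge pixels stretched outward than have the image tiled, you can change the wrap mode with glTexParameteri(). A minimal sketch, not part of the tutorial project:

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); // clamp horizontally
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); // clamp vertically

Note that OpenGL ES 2.0 actually requires GL_CLAMP_TO_EDGE for textures whose dimensions are not powers of two, which brings us to the next point.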

When you create images to use as textures under OpenGL ES 2.0 on the iPhone, you should size them so that the width and height are each a power of two. In other words, even though the width and height don’t have to match, each should be 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, or 2048. Keep in mind, though, that the iPhone screen is not huge, so smaller texture sizes should work just fine and will stay within Apple’s ‘best practices’ recommendations.
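If you want to catch a bad texture size at load time rather than puzzle over rendering glitches later, a quick check helps. This is a hypothetical helper, not part of the tutorial project; you could call it on the image’s width and height inside the texture-loading code and log a warning when it fails:

    // Returns YES when n is a power of two (1, 2, 4, 8, ...).
    static BOOL isPowerOfTwo(size_t n) {
        return n != 0 && (n & (n - 1)) == 0;
    }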

Let’s get started adding textures to our code.

First, we’re going to add an instance variable and a method declaration in our EDCubeDemoViewController.h file.

#import <UIKit/UIKit.h>

#import <OpenGLES/EAGL.h>

#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

@interface EDCubeDemoViewController : UIViewController {
@private
    EAGLContext *context;
    GLuint program;
   
    BOOL animating;
    NSInteger animationFrameInterval;
    CADisplayLink *displayLink;
   
    GLuint textureName; // ED: Added
}

@property (readonly, nonatomic, getter=isAnimating) BOOL animating;
@property (nonatomic) NSInteger animationFrameInterval;

- (void)startAnimation;
- (void)stopAnimation;
- (void)loadTexture:(GLuint *)textureName fromFile:(NSString *)fileName; // ED: Added

@end

The new instance variable, textureName, will be loaded with the ‘name’ of our texture when we load it at initialization time. Loading textures is an expensive operation in OpenGL, and not something you want to do a lot of during rendering. By loading this variable once, at initialization time, we can use it over and over at render time without paying the loading cost every frame.

The new method, also called at initialization time, is what we’ll use to load that new instance variable.

In the EDCubeDemoViewController.m file, we need to start by modifying our attribute and uniform arrays.

// Uniform index.
enum {
    UNIFORM_MVP_MATRIX,
    UNIFORM_TEXTURE, // ED: Added
    NUM_UNIFORMS
};
GLint uniforms[NUM_UNIFORMS];

// Attribute index.
enum {
    ATTRIB_VERTEX,
    //ATTRIB_COLOR, // ED: Removed
    ATTRIB_TEXTURE_COORD, // ED: Added
    NUM_ATTRIBUTES
};

The new UNIFORM_TEXTURE uniform will make our texture image data available to our fragment shader. The ATTRIB_COLOR attribute is no longer used and has been removed; the new ATTRIB_TEXTURE_COORD attribute will allow us to pass our texture coordinates into the vertex shader.

Next, in the awakeFromNib method, we’ll add a call to our new method.

- (void)awakeFromNib
{
    EAGLContext *aContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
   
    if (!aContext) {
        aContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    }
   
    if (!aContext)
        NSLog(@"Failed to create ES context");
    else if (![EAGLContext setCurrentContext:aContext])
        NSLog(@"Failed to set ES context current");
   
    self.context = aContext;
    [aContext release];
   
    [(EAGLView *)self.view setContext:context];
    [(EAGLView *)self.view setFramebuffer];
   
    if ([context API] == kEAGLRenderingAPIOpenGLES2)
        [self loadShaders];
   
    animating = FALSE;
    animationFrameInterval = 1;
    self.displayLink = nil;
   
    [self loadTexture:&textureName fromFile:@"apple_logo.png"]; // ED: Added
}

That last line is the new one. If you’re following along in your own code, make sure you get a PNG file from somewhere, add it to your project, and name it ‘apple_logo.png’, or download the project code from this chapter and grab the file.

We have quite a few changes in our drawFrame method.

- (void)drawFrame
{
    [(EAGLView *)self.view setFramebuffer];
   
    static const GLfloat squareVertices[] = {
        -0.5f, -0.5f,
        0.5f, -0.5f,
        -0.5f, 0.5f,
        0.5f, 0.5f,
    };
   
    /* ED: Removed
    static const GLubyte squareColors[] = {
        255, 255,   0, 255,
        0,   255, 255, 255,
        0,     0,   0,   0,
        255,   0, 255, 255,
    };
     */

   
    // ED: Added - the texture coordinates determine where on the texture the object will fall
    static const GLfloat squareTexCoords[] = {
        0, 0,
        1, 0,
        0, 1,
        1, 1
    };
   
    static float transY = 0.0f;
    static float transX = 0.0f;
   
    GLfloat mvpMatrix[16];
   
    glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
   
    if ([context API] == kEAGLRenderingAPIOpenGLES2) {
        glEnable(GL_TEXTURE_2D); // ED: Added
       
        glBindTexture(GL_TEXTURE_2D, textureName); // ED: Added
       
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // ED: Added
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); // ED: Added

        // Use shader program.
        glUseProgram(program);
       
        [EDMatrixTools applyIdentity:mvpMatrix];
       
        [EDMatrixTools applyScale:mvpMatrix x:0.5f y:0.5f z:1.0f];
        [EDMatrixTools applyRotation:mvpMatrix x:0 y:0 z:transY];
        [EDMatrixTools applyTranslation:mvpMatrix x:(cosf(transX) / 2.0f) y:(sinf(transY) / 2.0f) z:0.0f];
       
        [EDMatrixTools applyProjection:mvpMatrix aspect:1.5];
               
        // Update uniform value.
        glUniformMatrix4fv(uniforms[UNIFORM_MVP_MATRIX], 1, 0, mvpMatrix);
        glUniform1i(uniforms[UNIFORM_TEXTURE], 0); // ED: Added
        transY += 0.075f;  
        transX += 0.075f;
       
        // Update attribute values.
        glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
        glEnableVertexAttribArray(ATTRIB_VERTEX);
        //glVertexAttribPointer(ATTRIB_COLOR, 4, GL_UNSIGNED_BYTE, 1, 0, squareColors); // ED: Removed
        //glEnableVertexAttribArray(ATTRIB_COLOR); // ED: Removed
        glVertexAttribPointer(ATTRIB_TEXTURE_COORD, 2, GL_FLOAT, 0, 0, squareTexCoords);
        glEnableVertexAttribArray(ATTRIB_TEXTURE_COORD);
       
        // Validate program before drawing. This is a good check, but only really necessary in a debug build.
        // DEBUG macro must be defined in your debug configurations if that's not already the case.
#if defined(DEBUG)
        if (![self validateProgram:program]) {
            NSLog(@"Failed to validate program: %d", program);
            return;
        }
#endif
    }
   
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
   
    [(EAGLView *)self.view presentFramebuffer];
}

Let’s look at each of the changes.

    /* ED: Removed
    static const GLubyte squareColors[] = {
        255, 255,   0, 255,
        0,   255, 255, 255,
        0,     0,   0,   0,
        255,   0, 255, 255,
    };
     */

   
    // ED: Added - the texture coordinates determine where on the texture the object will fall
    static const GLfloat squareTexCoords[] = {
        0, 0,
        1, 0,
        0, 1,
        1, 1
    };

Since we’re using a texture now, and not colors, we’ve removed the color values array and added a texture coordinates array.

We’re going to line up each of our square’s four corners with the four corners of the texture.

        glEnable(GL_TEXTURE_2D); // ED: Added
       
        glBindTexture(GL_TEXTURE_2D, textureName); // ED: Added
       
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // ED: Added
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); // ED: Added

These functions set up the texture for use by OpenGL. The glEnable() function enables an OpenGL feature, in this case 2D textures. (Strictly speaking, this call is a holdover from the fixed-function pipeline; under OpenGL ES 2.0, texturing is driven entirely by the shaders, so enabling GL_TEXTURE_2D isn’t required.)

The glBindTexture() function binds our texture, represented by the textureName variable, to the OpenGL target GL_TEXTURE_2D. Desktop OpenGL also offers GL_TEXTURE_1D and GL_TEXTURE_3D targets, but OpenGL ES 2.0 supports only GL_TEXTURE_2D and GL_TEXTURE_CUBE_MAP, and we’re using a 2D texture here.

Once the texture is bound to the GL_TEXTURE_2D target, we use the glTexParameteri() function to change settings for the bound texture. In this case, the GL_TEXTURE_MIN_FILTER setting tells OpenGL how to handle the case where pixels in the texture need to be shrunk down to fit into a smaller space. By specifying GL_LINEAR, we are telling OpenGL to look at the pixels around the condensed space to figure out which color would look best there.

Similarly, the GL_TEXTURE_MAG_FILTER setting tells OpenGL how to handle the case where the area that needs to be filled in is larger than the texture data for that area. The GL_LINEAR option here will also look at the surrounding pixels to determine the best color to use for that area.

Keep in mind that when OpenGL is looking for texture data, it’s often looking for it one pixel at a time, so if the pixel it’s trying to color in on the screen doesn’t map to a single pixel color in the texture, it has to either make up a new one or somehow condense several pixel colors into one pixel color. That’s why we need to tell OpenGL how to process these cases.
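GL_LINEAR is not the only option. If you’d rather have a sharper, more pixelated look, GL_NEAREST simply picks the single closest texel instead of blending neighbors, and it’s cheaper. A variation you could try, not what the tutorial project uses:

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // pick nearest texel when shrinking
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST); // pick nearest texel when enlarging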

        glUniform1i(uniforms[UNIFORM_TEXTURE], 0); // ED: Added

When texture image data is passed into a shader, it is known as sampler data. In our fragment shader, we created a uniform variable of the sampler2D type. Calling glUniform1i() on this uniform doesn’t pass the image itself; it tells OpenGL which texture unit the sampler should read from. We pass 0, meaning texture unit 0 (GL_TEXTURE0), which is the active unit by default and the one our glBindTexture() call attached the texture to.
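That indirection matters as soon as you use more than one texture at a time. A hedged sketch of how a second texture might be wired up to unit 1 (the secondTextureName variable and UNIFORM_TEXTURE2 index are hypothetical):

    glActiveTexture(GL_TEXTURE1);                    // switch to texture unit 1
    glBindTexture(GL_TEXTURE_2D, secondTextureName); // bind another texture to it
    glUniform1i(uniforms[UNIFORM_TEXTURE2], 1);      // point a second sampler at unit 1
    glActiveTexture(GL_TEXTURE0);                    // restore the default unit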

        //glVertexAttribPointer(ATTRIB_COLOR, 4, GL_UNSIGNED_BYTE, 1, 0, squareColors); // ED: Removed
        //glEnableVertexAttribArray(ATTRIB_COLOR); // ED: Removed
        glVertexAttribPointer(ATTRIB_TEXTURE_COORD, 2, GL_FLOAT, 0, 0, squareTexCoords);
        glEnableVertexAttribArray(ATTRIB_TEXTURE_COORD);

Finally, we remove the code that passed in the colors array, since we’re no longer using it, and add code to pass in the array of texture coordinates instead.
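As a side note, positions and texture coordinates don’t have to live in separate arrays. Both attributes can be sourced from a single interleaved array by using the stride parameter of glVertexAttribPointer(), which tends to be friendlier to the GPU’s memory access patterns. A sketch under that assumption, not the layout the tutorial project uses:

    // Hypothetical interleaved layout: x, y, s, t for each vertex.
    typedef struct {
        GLfloat x, y; // position
        GLfloat s, t; // texture coordinate
    } Vertex;

    static const Vertex squareData[] = {
        { -0.5f, -0.5f, 0, 0 },
        {  0.5f, -0.5f, 1, 0 },
        { -0.5f,  0.5f, 0, 1 },
        {  0.5f,  0.5f, 1, 1 },
    };

    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, sizeof(Vertex), &squareData[0].x);
    glVertexAttribPointer(ATTRIB_TEXTURE_COORD, 2, GL_FLOAT, 0, sizeof(Vertex), &squareData[0].s);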

Now that we have a new shader uniform and a change to the shader attributes, we need to update our loadShaders method to reflect the changes. First, let’s look at the entire method, and then we’ll go over the changes.

- (BOOL)loadShaders
{
    GLuint vertShader, fragShader;
    NSString *vertShaderPathname, *fragShaderPathname;
   
    // Create shader program.
    program = glCreateProgram();
   
    // Create and compile vertex shader.
    vertShaderPathname = [[NSBundle mainBundle] pathForResource:@"Shader" ofType:@"vsh"];
    if (![self compileShader:&vertShader type:GL_VERTEX_SHADER file:vertShaderPathname])
    {
        NSLog(@"Failed to compile vertex shader");
        return FALSE;
    }
   
    // Create and compile fragment shader.
    fragShaderPathname = [[NSBundle mainBundle] pathForResource:@"Shader" ofType:@"fsh"];
    if (![self compileShader:&fragShader type:GL_FRAGMENT_SHADER file:fragShaderPathname])
    {
        NSLog(@"Failed to compile fragment shader");
        return FALSE;
    }
   
    // Attach vertex shader to program.
    glAttachShader(program, vertShader);
   
    // Attach fragment shader to program.
    glAttachShader(program, fragShader);
   
    // Bind attribute locations.
    // This needs to be done prior to linking.
    glBindAttribLocation(program, ATTRIB_VERTEX, "position");
    //glBindAttribLocation(program, ATTRIB_COLOR, "color"); // ED: Removed
    glBindAttribLocation(program, ATTRIB_TEXTURE_COORD, "texture_coord"); // ED: Added
   
    // Link program.
    if (![self linkProgram:program])
    {
        NSLog(@"Failed to link program: %d", program);
       
        if (vertShader)
        {
            glDeleteShader(vertShader);
            vertShader = 0;
        }
        if (fragShader)
        {
            glDeleteShader(fragShader);
            fragShader = 0;
        }
        if (program)
        {
            glDeleteProgram(program);
            program = 0;
        }
       
        return FALSE;
    }
   
    // Get uniform locations.
    uniforms[UNIFORM_MVP_MATRIX] = glGetUniformLocation(program, "mvp_matrix");
    uniforms[UNIFORM_TEXTURE] = glGetUniformLocation(program, "texture"); // ED: Added
   
    // Release vertex and fragment shaders.
    if (vertShader)
        glDeleteShader(vertShader);
    if (fragShader)
        glDeleteShader(fragShader);
   
    return TRUE;
}

This method is responsible for binding our shader attributes before linking them, and retrieving the shader uniform locations afterwards. Since we’ve replaced the ATTRIB_COLOR attribute with the ATTRIB_TEXTURE_COORD attribute (and the ‘color’ shader attribute with the ‘texture_coord’ attribute), we need to update the section that calls the glBindAttribLocation() function.

    // Bind attribute locations.
    // This needs to be done prior to linking.
    glBindAttribLocation(program, ATTRIB_VERTEX, "position");
    //glBindAttribLocation(program, ATTRIB_COLOR, "color"); // ED: Removed
    glBindAttribLocation(program, ATTRIB_TEXTURE_COORD, "texture_coord"); // ED: Added

Here, we remove the binding to the vertex shader’s ‘color’ attribute and add a binding for the new ‘texture_coord’ attribute.

   // Get uniform locations.
    uniforms[UNIFORM_MVP_MATRIX] = glGetUniformLocation(program, "mvp_matrix");
    uniforms[UNIFORM_TEXTURE] = glGetUniformLocation(program, "texture"); // ED: Added

After linking the shaders, we then add a call to glGetUniformLocation() for our new ‘texture’ uniform.
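glGetUniformLocation() returns -1 when the named uniform can’t be found, for example if the GLSL compiler optimized away a uniform you never actually use. A small defensive check here can save debugging time later; a sketch:

    if (uniforms[UNIFORM_TEXTURE] == -1)
        NSLog(@"Warning: uniform 'texture' was not found in the shader program");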

The next change to the EDCubeDemoViewController.m file is the addition of a method to load in our image data and create a texture that OpenGL can use.

// ED: Added method to read in graphics file as texture
- (void)loadTexture:(GLuint *)newTextureName fromFile:(NSString *)fileName {
    // Load image from file and get reference
    UIImage *image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:fileName ofType:nil]];
    CGImageRef imageRef = [image CGImage];
   
    if(imageRef) {
        // get width and height
        size_t imageWidth = CGImageGetWidth(imageRef);
        size_t imageHeight = CGImageGetHeight(imageRef);
       
        GLubyte *imageData = (GLubyte *)malloc(imageWidth * imageHeight * 4);
        memset(imageData, 0, (imageWidth * imageHeight * 4));
       
        CGContextRef imageContextRef = CGBitmapContextCreate(imageData, imageWidth, imageHeight, 8, imageWidth * 4, CGImageGetColorSpace(imageRef), kCGImageAlphaPremultipliedLast);
       
        // Make CG system interpret OpenGL style texture coordinates properly by inverting Y axis
        CGContextTranslateCTM(imageContextRef, 0, imageHeight);
        CGContextScaleCTM(imageContextRef, 1.0, -1.0);
       
        CGContextDrawImage(imageContextRef, CGRectMake(0.0, 0.0, (CGFloat)imageWidth, (CGFloat)imageHeight), imageRef);
       
        CGContextRelease(imageContextRef);
       
        glGenTextures(1, newTextureName);
       
        glBindTexture(GL_TEXTURE_2D, *newTextureName);
       
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imageWidth, imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
       
        free(imageData);
       
        glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    }
   
    [image release];        
}

Let’s look at this method bit by bit.

    // Load image from file and get reference
    UIImage *image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:fileName ofType:nil]];
    CGImageRef imageRef = [image CGImage];

Here we’re using the UIImage class to load our image file from disk. The UIImage class is capable of reading image data in several popular formats, including TIFF, JPEG, GIF, and PNG. Once we have a UIImage object, we can get to the underlying Quartz image data by using the CGImage property, which returns a CGImageRef object.

    if(imageRef) {
        // get width and height
        size_t imageWidth = CGImageGetWidth(imageRef);
        size_t imageHeight = CGImageGetHeight(imageRef);
       
        GLubyte *imageData = (GLubyte *)malloc(imageWidth * imageHeight * 4);
        memset(imageData, 0, (imageWidth * imageHeight * 4));

If we get the CGImageRef data that we asked for, we process it. The first thing we do is get the width and height of the image, and use that information to allocate a block of memory to hold the image data that we’re going to give to OpenGL.

        CGContextRef imageContextRef = CGBitmapContextCreate(imageData, imageWidth, imageHeight, 8, imageWidth * 4, CGImageGetColorSpace(imageRef), kCGImageAlphaPremultipliedLast);

The goal of this line of code is to create an area in memory for our image data that is compatible with OpenGL.

The CGBitmapContextCreate() function takes a pointer to a memory area as the first parameter and formats it based on the parameters that follow. The width and height are self-explanatory, and the parameter after those is the number of bits per color component; with 8 bits, each of the four components (red, green, blue, alpha) can hold 256 values. The next parameter, imageWidth * 4, is the number of bytes of memory needed per row, since each pixel uses four one-byte components. The second-to-last parameter uses CGImageGetColorSpace() to get the color space of the image we’ve been working with. Finally, the last parameter is a set of bit flags for the new image’s attributes. In this case, we’re only specifying kCGImageAlphaPremultipliedLast, which causes iOS to apply the alpha values to the color components beforehand (premultiplied) to save time.
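For our 128 by 128 logo, the buffer sizes work out like this:

    // 128 pixels per row x 4 bytes per pixel = 512 bytes per row
    // 512 bytes per row x 128 rows = 65,536 bytes (64 KB) in total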

        // Make CG system interpret OpenGL style texture coordinates properly by inverting Y axis
        CGContextTranslateCTM(imageContextRef, 0, imageHeight);
        CGContextScaleCTM(imageContextRef, 1.0, -1.0);

On the iPhone, these two lines will save you a lot of grief.

Remember how the iPhone sees (0, 0) as the upper left-hand corner of the screen, extending downwards? That’s the coordinate space UIKit uses (a convention inherited from QuickDraw), and it’s the coordinate space we use when we’re reading touches off of the screen. Apple’s Core Graphics library, however, sees screen coordinate (0, 0) as the lower left-hand corner.

When you use some of the higher level Apple classes, the Core Graphics coordinate system is translated automatically for you, so you can properly read touch locations and the like in the coordinate space we’re used to, with (0, 0) in the upper left-hand corner.

However, when we get deeper into the Core Graphics library, as we’re doing here, the coordinate space is flipped along the Y axis, which means I’d be drawing my image relative to (0, 0) in the lower left-hand corner. When iOS helpfully converts between the two coordinate spaces later, my images would come out flipped.

The CGContextTranslateCTM() and CGContextScaleCTM() functions instruct Core Graphics to flip the Y axis, effectively drawing the image upside down as far as Core Graphics is concerned. When iOS converts between the coordinate systems for use later the image will be oriented properly.
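An alternative, not used in this tutorial, is to skip the CTM flip and instead flip the T values in the texture coordinates themselves so that T = 0 lines up with the top of the image:

    // Hypothetical alternative: T coordinates flipped to compensate for the unflipped bitmap.
    static const GLfloat squareTexCoordsFlipped[] = {
        0, 1,
        1, 1,
        0, 0,
        1, 0
    };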

        CGContextDrawImage(imageContextRef, CGRectMake(0.0, 0.0, (CGFloat)imageWidth, (CGFloat)imageHeight), imageRef);
       
        CGContextRelease(imageContextRef);

Here we draw the image using the imageContextRef we created earlier. After this call, our imageData buffer will be loaded and ready to use with OpenGL. Once the call is complete, we’re finished with the imageContextRef variable, and CGContextRelease() decrements its retain count.

        glGenTextures(1, newTextureName);
       
        glBindTexture(GL_TEXTURE_2D, *newTextureName);
       
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imageWidth, imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
       
        free(imageData);

The glGenTextures() function generates ‘names’ (integer IDs) for some number of OpenGL textures. In this case we only want one, as indicated by the first parameter, and the new name is stored in the variable that newTextureName points to.

The glBindTexture() function binds the new texture to a texture target; as noted earlier, GL_TEXTURE_2D and GL_TEXTURE_CUBE_MAP are the targets available under OpenGL ES 2.0, and we’re using GL_TEXTURE_2D.

The glTexImage2D() function call loads the image into OpenGL and makes it available to the shaders.

Finally, since we’ve already handed the image data to OpenGL, we can free the memory we allocated earlier for this data.
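One optional step you could add right after glTexImage2D(), since our texture is a power-of-two size, is mipmap generation. OpenGL then pre-computes scaled-down copies of the image, which look better and sample faster when the texture is minified. A sketch, assuming you also switch the min filter to a mipmap-aware mode:

    glGenerateMipmap(GL_TEXTURE_2D); // build the mipmap chain from the base image
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);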

   }
   
    [image release];

That’s the end of the conditional block, and we release the UIImage object that we allocated at the beginning of the method.
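One thing the tutorial code doesn’t show is tearing the texture back down. When the texture is no longer needed, for example in the view controller’s dealloc, you’d hand the name back to OpenGL; a sketch:

    glDeleteTextures(1, &textureName); // free the OpenGL texture object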

With all of the code changes we’ve made so far, we now have a texture image being loaded and processed in our EDCubeDemoViewController code. Now we just need to modify the shaders to actually use that data.

In the vertex shader, we need to make the following changes.

attribute vec4 position;
//attribute vec4 color; // ED: Removed
attribute vec4 texture_coord;

//varying vec4 colorVarying; //ED: Removed
varying vec2 texCoordVarying;

uniform mat4 mvp_matrix;

void main()
{
    gl_Position = mvp_matrix * position;

    //colorVarying = color; // ED: Removed
    texCoordVarying = texture_coord.st; // ED: Added
}

Since we’re no longer using colors, we can remove all of the code that used that data. In its place, we’ll add the texture related attribute and varying.

The texture_coord variable will hold the texture coordinate for the vertex currently being processed, and texCoordVarying will pass the interpolated texture coordinates along to the fragment shader.

In the shader code, we load the texCoordVarying variable with the S and T texture coordinates.

In the fragment shader, we’ll make the following changes.

//varying lowp vec4 colorVarying; // ED: Removed
varying highp vec2 texCoordVarying; // ED: Added

uniform sampler2D texture; // ED: Added

void main()
{
    //gl_FragColor = colorVarying; // ED: Removed
    gl_FragColor = texture2D(texture, texCoordVarying); // ED: Added
}

The code dealing with color has been removed, and the new varying for the interpolated texture coordinates, texCoordVarying, has been added.

A new uniform named ‘texture’ has been added too, with a type of sampler2D. The sampler2D type is a special OpenGL type that indicates this uniform variable is a texture.

In the shader code, we can use the texture2D function to get the texture data for the interpolated texture coordinate and apply it to our current fragment.
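Since texture2D() just returns a vec4 color, you can manipulate it like any other color before assigning it to gl_FragColor. For example, a red-tinted variation (not used in the project):

    gl_FragColor = texture2D(texture, texCoordVarying) * vec4(1.0, 0.5, 0.5, 1.0);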

That’s the last change, so let’s run our code and see what happens.

(Screenshot: iPhone screen)

Great.

I think we’ve had enough of this square, so let’s finally put a cube into the EDCubeDemo project.

