First of all, what is OpenGL, anyway? According to Wikipedia, OpenGL is “a standard specification defining a cross-language, cross-platform API for writing applications that produce 2D and 3D computer graphics.” The OpenGL specifications are currently maintained by the Khronos Group, a consortium made up of representatives from various companies in related industries.
The full OpenGL library is used by computers to drive the high-powered graphics cards that we put in our machines to play the latest video games and heat up our rooms. But for smaller devices like the iPhone, we use OpenGL ES, which is a subset of the full OpenGL specification. The ‘ES’ stands for Embedded Systems, but don’t think that means it’s not powerful. OpenGL ES on the iPhone is capable of some amazing things, and if you’ve played around with Epic Citadel from Epic Games, you know what I mean.
Originally, the iPhone supported only OpenGL ES 1.1, which you’ll often hear described as a ‘fixed-pipeline’ system, while the more recently supported OpenGL ES 2.0 is described as having a ‘programmable pipeline’. But what does that mean to a developer, and why does it matter?
In OpenGL ES 1.1, the developer was given a library of functions for setting data values used by the rendering process: altering matrices that hold positional and rotational data, setting up lighting, configuring camera parameters, and so on. Once you had specified those parameters, OpenGL would go off and do its thing, rendering the scene in a predetermined way, using the programmer’s data only to tweak certain things. Life was good, but a little predictable, because the rendering process (or pipeline) was ‘fixed’.
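To make that concrete, here’s a small sketch of what ES 1.1-style setup looks like. The function calls are real OpenGL ES 1.1 API, but the snippet assumes an already-configured GL context and a hypothetical `setupFixedPipeline` helper of our own, so treat it as an illustration rather than a complete program:

```c
// Sketch of OpenGL ES 1.1 fixed-function setup (assumes a valid GL context).
// We only hand parameters to the pipeline; the rendering logic itself is fixed.

#include <OpenGLES/ES1/gl.h>   // iPhone SDK header for OpenGL ES 1.1

static void setupFixedPipeline(void)
{
    // Position and rotate the model by tweaking the built-in modelview matrix.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotatef(45.0f, 0.0f, 1.0f, 0.0f);

    // Configure the built-in lighting model -- no shader code required.
    const GLfloat lightPos[] = { 0.0f, 2.0f, 2.0f, 1.0f };
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, lightPos);
}
```

Notice that we never say *how* lighting or transformation should be computed; we only fill in the parameters that the fixed pipeline already knows how to use.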
In OpenGL ES 2.0, all those handy functions for setting up the rendering parameters are gone, and instead we use ‘shaders’, which are little programs that OpenGL runs when it’s time to render something. OpenGL no longer cares what you want to do with the camera, the lights, or the objects you have defined. The way these things are rendered is no longer ‘fixed’; it is now determined programmatically, through the shaders we write, at the appropriate steps in the rendering process (or pipeline), hence the phrase ‘programmable pipeline’.
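As a small taste of what those little programs look like, here is a minimal GLSL ES shader pair. The work that ES 1.1 did for us behind the scenes, transforming each vertex, must now be written out explicitly; the `modelViewProjectionMatrix` uniform name here is our own invention, supplied by our application code:

```glsl
// Vertex shader: runs once per vertex. We must transform positions
// ourselves -- there is no built-in modelview/projection matrix anymore.
attribute vec4 position;
uniform mat4 modelViewProjectionMatrix;  // hypothetical uniform set by our app

void main()
{
    gl_Position = modelViewProjectionMatrix * position;
}
```

```glsl
// Fragment shader: runs once per pixel. Here we just output a solid color,
// but this is where custom lighting, texturing, and effects would live.
precision mediump float;

void main()
{
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);  // opaque orange
}
```

It’s more typing for the simple cases, but that fragment shader is exactly where the door opens to effects the fixed pipeline could never produce.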
With a little extra work, graphics programmers can now do a lot more during the rendering process, and have much more control over how OpenGL will render things.
We’ll learn more about shaders when we go through the template that Xcode uses for its OpenGL ES Application projects.
Rather than spend a lot of time right now going over all of the conceptual bits and pieces of how OpenGL ES 2.0 works, let’s dive right into the Xcode project template and save the explanations for the relevant sections of code. Putting the concepts together with the actual code has always worked best for me, so that’s how I’ll construct this tutorial.