
Rendering, OpenGL, and Lighting - Caltech


Spaces

• Object space (also called model space): a convenient space to model the object
  – Coordinates of an object relative to its center


Spaces

• Camera space: camera placed at the origin, generally looking down the -Z axis with +Y up and +X to the right


Spaces

• (Clipped) Normalized Device Coordinates (NDC)
  – Camera still looking at -Z, +Y up, +X right
  – Space scaled so that x, y, z are each in [-1, 1]
• Can think of x, y as screen coordinates


Spaces

• Screen space (pixel coordinates)
  – Just a scaling of NDC
• As we said, NDC x, y already behave like screen coordinates
  – Screen transform maps [-1, 1] x [-1, 1] to [0, screenWidth] x [0, screenHeight]
  – Once in screen space, it's simple to draw the pixels
  – This is usually done in hardware; we won't think too much about it here
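The screen transform above is just a shift and scale. A minimal sketch in Python (our own helper, not part of the course code; note that real window systems often also flip the y axis, which we omit here):

```python
def ndc_to_screen(x, y, w, h):
    # Map NDC x, y in [-1, 1] to pixel coordinates in [0, w] x [0, h]:
    # shift [-1, 1] to [0, 2], then scale by half the screen size.
    return ((x + 1.0) * 0.5 * w, (y + 1.0) * 0.5 * h)

# The NDC corners land on the screen corners, and the center on the center:
print(ndc_to_screen(-1, -1, 640, 480))  # (0.0, 0.0)
print(ndc_to_screen(0, 0, 640, 480))    # (320.0, 240.0)
```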


Spaces

Nominally, the spaces are related as such:

Model space → (world transforms) → World space → (camera transform) → Camera space → (projection transform) → NDC → (pixel/viewport transform) → Pixel coordinates → (rasterization) → Rendered image

The graphics engine does the final steps (viewport transform and rasterization) internally.


Spaces

OpenGL eliminates world space:

Model space → (modelview transforms) → Camera space → (projection transform) → NDC → (pixel/viewport transform) → Pixel coordinates → (rasterization) → Rendered image

The graphics engine does the final steps (viewport transform and rasterization) internally.


Transformations

• Matrix multiplication for object, world, and camera transformations
• Take V = (x, y, z, 1); we use the 4th coordinate (the 1) to make translations easy
  – Set up a stack of matrix transformations O1, O2, ..., On
  – Multiply together in reverse order to apply the transformation
  – That is, Vfinal = O1 O2 ... On V
• Note: rotation and uniform scaling commute; translation does NOT commute
  – TRS = TSR; TRS != RTS != RST
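The non-commutativity of translation is easy to see numerically. A small sketch (our own Python, with 4x4 row-major matrices; the helper names are ours):

```python
def matmul(A, B):
    # 4x4 row-major matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(M, v):
    # Apply a 4x4 matrix to a homogeneous vector (x, y, z, 1).
    return tuple(sum(M[i][k] * v[k] for k in range(4)) for i in range(4))

def translate(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def scale(a, b, c):
    return [[a, 0, 0, 0], [0, b, 0, 0], [0, 0, c, 0], [0, 0, 0, 1]]

# TS applies the scale first, then the translation (rightmost acts first);
# ST does the reverse, and the results differ:
TS = matmul(translate(1, 0, 0), scale(2, 2, 2))
ST = matmul(scale(2, 2, 2), translate(1, 0, 0))
v = (1, 1, 1, 1)
print(apply(TS, v))  # (3, 2, 2, 1)
print(apply(ST, v))  # (4, 2, 2, 1)
```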


Transformations

• All transformations realized as 4x4 matrices
• Translation by (x, y, z):

      [ 1 0 0 x ]
  T = [ 0 1 0 y ]
      [ 0 0 1 z ]
      [ 0 0 0 1 ]

  – See why the 4th coordinate is useful?
• Scaling by (a, b, c), that is, scale x by a, y by b, z by c:

      [ a 0 0 0 ]
  S = [ 0 b 0 0 ]
      [ 0 0 c 0 ]
      [ 0 0 0 1 ]


Transformations

• Rotation is more complicated...
• Let v = (x, y, z) with |v| = 1 be the axis of rotation and a be the angle of rotation
• We define R with the cross-product (skew-symmetric) matrix S of v:

      [  0 -z  y ]
  S = [  z  0 -x ]
      [ -y  x  0 ]

  – Then R = I + S sin(a) + S^2 (1 - cos(a))
  – Note that R^-1 = R^T
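The formula above (Rodrigues' rotation formula) can be checked numerically. A sketch in Python (our own code; a 90° rotation about +Z should send +X to +Y):

```python
import math

def rotation(axis, a):
    # Rodrigues' formula: R = I + S sin(a) + S^2 (1 - cos(a)),
    # where S is the cross-product matrix of the unit axis (x, y, z).
    x, y, z = axis
    S = [[0, -z, y], [z, 0, -x], [-y, x, 0]]
    S2 = [[sum(S[i][k] * S[k][j] for k in range(3)) for j in range(3)]
          for i in range(3)]
    s, c = math.sin(a), math.cos(a)
    return [[(1 if i == j else 0) + s * S[i][j] + (1 - c) * S2[i][j]
             for j in range(3)] for i in range(3)]

R = rotation((0, 0, 1), math.pi / 2)   # 90 degrees about +Z
# R is approximately [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
```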


Transformations

• Projection transformation (camera space to NDC)


Transformations

• Projection transformation
  – n = near, f = far, l = left, r = right, t = top, b = bottom

      [ 2n/(r-l)     0       (r+l)/(r-l)       0       ]
  P = [    0      2n/(t-b)   (t+b)/(t-b)       0       ]
      [    0         0      -(f+n)/(f-n)  -2fn/(f-n)   ]
      [    0         0           -1             0      ]


Transformations

• Projection: just add it to the beginning of the stack
  – Vfinal = P O1 O2 ... On V
• If Vfinal = (x, y, z, w) then Vfinal,NDC = (x/w, y/w, z/w, 1), clipped to [-1, 1]
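The projection plus the divide by w can be sketched in a few lines of Python (our own illustration, using the standard glFrustum matrix; with n = 1, f = 3, the near-plane center should map to NDC z = -1 and the far-plane center to z = +1):

```python
def frustum(l, r, b, t, n, f):
    # The standard OpenGL glFrustum matrix (camera space -> clip space).
    return [[2*n/(r-l), 0, (r+l)/(r-l), 0],
            [0, 2*n/(t-b), (t+b)/(t-b), 0],
            [0, 0, -(f+n)/(f-n), -2*f*n/(f-n)],
            [0, 0, -1, 0]]

def to_ndc(P, v):
    # Apply P, then perform the perspective divide by w.
    x, y, z, w = (sum(P[i][k] * v[k] for k in range(4)) for i in range(4))
    return (x/w, y/w, z/w, 1.0)

P = frustum(-1, 1, -1, 1, 1, 3)
near_pt = to_ndc(P, (0, 0, -1, 1))  # z component is -1.0
far_pt = to_ndc(P, (0, 0, -3, 1))   # z component is 1.0
```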


Rasterization

• Draw triangles from NDC vertices
• Draw pixels inside the triangle boundary, interpolating data from the vertices
  – Barycentric coordinates
• Initial pixel color interpolated from vertex data
  – Can modify with GPU shader code
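Barycentric coordinates express a point p inside a triangle as weights (u, v, w) on the three vertices, with u + v + w = 1; interpolated vertex data is then just u*Da + v*Db + w*Dc. A 2D sketch using signed areas (our own Python, not the hardware algorithm):

```python
def barycentric(p, a, b, c):
    # Solve p = u*a + v*b + w*c with u + v + w = 1 via signed sub-areas.
    def edge(p1, p2, p3):
        # Twice the signed area of triangle (p1, p2, p3).
        return (p2[0]-p1[0]) * (p3[1]-p1[1]) - (p2[1]-p1[1]) * (p3[0]-p1[0])
    area = edge(a, b, c)
    u = edge(p, b, c) / area   # weight of vertex a
    v = edge(a, p, c) / area   # weight of vertex b
    return (u, v, 1.0 - u - v)

# At a vertex, that vertex gets full weight; at the centroid, all are 1/3:
tri = ((0, 0), (1, 0), (0, 1))
print(barycentric((0, 0), *tri))        # (1.0, 0.0, 0.0)
print(barycentric((1/3, 1/3), *tri))    # approximately (1/3, 1/3, 1/3)
```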


Buffers

• Z-buffer
  – Store the (linearly interpolated) depth of each pixel, in NDC
  – Don't draw pixels if a nearer pixel has already been drawn
  – Easy method to ensure that farther objects aren't drawn over nearer objects
• Double buffering
  – Render to one buffer while the other is displayed onscreen
  – Swap buffers when done drawing
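The z-buffer test is a single comparison per pixel write. A minimal sketch (our own Python, not OpenGL internals; here smaller z means nearer):

```python
# A tiny framebuffer: depth starts at "infinitely far", colors start empty.
W, H = 4, 4
zbuf = [[float("inf")] * W for _ in range(H)]
image = [[None] * W for _ in range(H)]

def draw_pixel(x, y, z, color):
    # Depth test: only write if this fragment is nearer than what's stored.
    if z < zbuf[y][x]:
        zbuf[y][x] = z
        image[y][x] = color

draw_pixel(1, 1, 0.8, "far")    # accepted: buffer was empty
draw_pixel(1, 1, 0.3, "near")   # accepted: nearer than 0.8
draw_pixel(1, 1, 0.5, "mid")    # rejected: farther than 0.3
print(image[1][1])  # "near"
```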


OpenGL

• So what does OpenGL do?
  – Implements the basic rendering pipeline
• Works as a massive state machine to modify rendering properties
  – Change its state in order to render
  – When a property is set, it won't change until it's reset by the code
  – Some properties: vertex color, lighting mode, transparency mode


OpenGL transformation pipeline

• Vertices in object space transformed to camera space by the modelview matrix (combines world and camera matrices)
• Viewport transformation converts NDC to pixel coordinates (based on the window/screen size)


OpenGL transformation stacks

• Stacks of matrices usable to store previous transformations
• Two stacks:
  – Modelview matrix stack (32 4x4 matrices)
  – Projection matrix stack (2 4x4 matrices)


OpenGL transformation stacks

• Push and pop transformations
  – i.e. render an arm, push
  – translate to hand, draw palm, push
    • translate to and draw finger 1, pop, push
    • translate to and draw finger 2, pop, push
    • ...
  – pop, translate to other hand...
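The push/pop pattern above can be mimicked with a plain stack. A sketch (our own Python, tracking only cumulative translations for brevity; the names and offsets are made up for illustration):

```python
# A stack of cumulative translations standing in for the modelview stack.
stack = [(0.0, 0.0, 0.0)]

def push():
    stack.append(stack[-1])    # save the current transform

def pop():
    stack.pop()                # restore the saved transform

def translate(x, y, z):
    cx, cy, cz = stack[-1]
    stack[-1] = (cx + x, cy + y, cz + z)

drawn = []
def draw(name):
    drawn.append((name, stack[-1]))   # "draw" at the current transform

# Arm -> palm -> fingers, as in the slide:
translate(0, 0, 5); draw("arm")
push()
translate(2, 0, 0); draw("palm")
push(); translate(0.5, 0, 0); draw("finger1"); pop()
push(); translate(0.7, 0, 0); draw("finger2"); pop()
pop()
draw("back at arm")   # popping restored the arm's transform
```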


OpenGL transformations

• Function calls to build 4x4 matrices and place them on the stack
  – glTranslatef(x, y, z)
  – glRotatef(a, x, y, z)
    • a in degrees; x, y, z don't have to be normalized
  – glScalef(x, y, z)
  – glFrustum(l, r, b, t, n, f)
    • Sets up the projection transformation


OpenGL transformations

• Can set up the projection more easily using the GLU (GL Utility) library
• gluPerspective(fovy, aspect, near, far)
  – aspect = w/h


OpenGL geometry/rendering

• How do we use these transformations to render triangles?

  // set up transformations
  ...
  glBegin(GL_TRIANGLES);
  glVertex3f(x1, y1, z1);
  glVertex3f(x2, y2, z2);
  glVertex3f(x3, y3, z3);
  // more sets of 3 vertices for more triangles
  glEnd();

• Add glNormal3f(xi, yi, zi) calls before glVertex3f calls to attach normals to vertices


OpenGL geometry/rendering

• glShadeModel(GL_SMOOTH / GL_FLAT)
  – Tells OpenGL whether to use per-vertex lighting (Gouraud) or per-face lighting (flat)
  – This only affects the fixed pipeline; it won't matter once we use GPU shaders (we'll override the default lighting)
• Lighting functions to set up light state
  – Same for material properties
  – Generally done during initialization; take a look at the homework 1 code to see how it's done
  – Light and material state can be read in GPU code, so it's still useful


How to render with OpenGL

• Initialize
  – Set up lights, materials, buffering, window size, camera/projection matrices, etc.
• Redraw
  – Clear the current backbuffer (from the double-buffer setup, it's the one not being displayed)
  – Set up transformations, glBegin, glNormal3f/glVertex3f, glEnd
  – Swap buffers


OpenGL Utility Toolkit (GLUT)

• Very simple windowing system interfaced with OpenGL
• Sets up callback functions that are called when events occur
  – For example, mouse movement, keystrokes, window redrawing
• We'll use GLUT for the assignments
  – But we'll give you most of that code


Simple lighting models

• Flat, Gouraud, Phong
  – Per-face, per-vertex, and per-pixel applications of the same lighting equation


Simple lighting models

• The rendered color of an object is the sum of several components
  – Ambient is simple: it's a static light


Simple lighting models

• The rendered color of an object is the sum of several components
  – Diffuse and specular represent reflections

(graphic from Wikipedia)


Diffuse lighting

• Diffuse component equation:
  – Id = Cd * cos(a)
  – Cd is the diffuse color, L is the vector from the light, N is the normal vector at the surface, and a is the angle between L and N
  – Can find cos(a) using N . L


Specular lighting

• Specular component equation:
  – Is = Cs * (R . Eye)^S
  – Cs is the specular color, S is the 'shininess' of the object, Eye is the vector to the camera, R is the reflection of L about N
    • Or, R = 2N (L . N) - L
  – All vectors must be normalized!


Overall lighting model

• Lighting equation: Color = Ia + Id + Is
  – Ia = ambient color
  – Id = diffuse color
  – Is = specular color
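The three terms combine directly. A numeric sketch of Color = Ia + Id + Is (our own Python; colors are scalars here for simplicity, and all vectors are assumed unit length as the slides require):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(L, N):
    # R = 2N (L . N) - L
    d = dot(L, N)
    return tuple(2 * d * n - l for n, l in zip(N, L))

def phong(N, L, Eye, Ca, Cd, Cs, S):
    Id = Cd * max(dot(N, L), 0.0)          # diffuse: Cd * cos(a)
    R = reflect(L, N)
    Is = Cs * max(dot(R, Eye), 0.0) ** S   # specular: Cs * (R . Eye)^S
    return Ca + Id + Is

# Light and eye straight along the normal: all three terms at full strength.
c = phong((0, 0, 1), (0, 0, 1), (0, 0, 1), 0.1, 0.6, 0.3, 10)
print(c)  # 1.0
```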


Homework 1

• Implement per-pixel lighting using GLSL
• Should be a short homework; mainly to make sure GLSL is working for everyone
• Due by class time next Wednesday
  – Don't forget to include a readme file...


Recitation 1

CS 179: GPU Programming
Lecture originally written by Luke Durant, Tamas Szalay, Russell McClellan


Recitation contents

• Explanation of lab 1
• Phong shading model
• Introduction to GLSL
  – Basic GLSL syntax


Lab 1

• Create a per-pixel Phong renderer
• Use GLSL to create vertex and pixel shaders


Simple lighting models

• Diffuse component equation
  – Id = Cd * cos(a)
  – Cd is the diffuse color, L is the light vector, and N is the normal vector
  – Can find cos(a) using N . L


Simple lighting models

• Specular component equation
  – Is = Cs * (R . Eye)^S
  – Cs is the specular color, S is the shininess
  – R = 2N(L . N) - L, or use reflect()
  – Eye = -Pos (if in eye space)
• All vectors must be normalized!


Simple lighting models

• Overall lighting equation
  – Color = Ia + Id + Is
  – Ia is the ambient intensity
  – Id is the diffuse intensity
  – Is is the specular intensity


Blinn shading

• OpenGL uses a simplified version called Blinn shading, and only computes lighting at the vertices
• Intensity proportional to (H . N)^S, where H = normalize(L + Eye) is the half vector
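Blinn's term swaps the reflection vector for the half vector, which is cheaper to compute. A sketch (our own Python, assuming H = normalize(L + Eye) as above):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def blinn_specular(N, L, Eye, S):
    # (H . N)^S with the half vector H = normalize(L + Eye).
    H = normalize(tuple(l + e for l, e in zip(L, Eye)))
    return max(sum(h * n for h, n in zip(H, N)), 0.0) ** S

# When L and Eye both align with N, H = N, so the term is 1:
print(blinn_specular((0, 0, 1), (0, 0, 1), (0, 0, 1), 8))  # 1.0
```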


Blinn shading


Phong shading (vertices)


Phong shading (pixels)


GLSL

• GLSL is a high-level language for writing GPU programs
• Two most important types of GLSL shaders: vertex shaders and fragment shaders
  – There are also geometry shaders, etc., but we won't use these
• These shaders override specific parts of the graphics pipeline


Default OpenGL pipeline

Vertex data → object space to NDC conversion, per-vertex shading → rasterization and interpolation → per-pixel coloration


Shader pipeline

Vertex data → vertex shader → rasterization and interpolation → fragment shader


Notes on GLSL pipeline

• The vertex and pixel shader pipeline was designed for graphics
  – There are some issues with using it for non-graphics computation
• Since shaders replace existing parts of the pipeline, we often need to re-implement and extend the default algorithms
  – However, the GLSL environment makes this pretty easy


Vertex shaders

• Vertex shaders are responsible for all per-vertex calculations
• In the default pipeline:
  – Convert vertices from object space to NDC
  – Perform all lighting/texture coordinate calculations


Fragment shaders

• Fragment shaders are responsible for all per-pixel calculations
• Before the frag shader, all per-vertex data is interpolated by the rasterizer
• In the default pipeline:
  – Simply sets the output color of the pixel to the interpolated vertex color


Vertex and fragment shader calls

• The vertex shader runs once for every vertex
• The fragment shader runs once for every pixel rendered in the scene, with vertex data interpolated


GLSL syntax and programming

• GLSL has very similar syntax to C
  – main() is the entry point for both vert and frag shaders
• Some extra language support for vector and matrix operations
• Lots of built-in functions and variables


GLSL types

• Floating-point: float, vec2, vec3, vec4
• Integer: int, ivec2, ivec3, ivec4
• Boolean: bool, bvec2, bvec3, bvec4
• Matrices: mat2, mat3, mat4


GLSL built-in variables

• Handy reference guide to all built-in functions and variables:
  – http://www.khronos.org/files/opengl-quick-reference-card.pdf
  – Bookmark this!
  – It's a good reference for OpenGL in general, so it may be useful for the next few weeks


Vertex shader built-in variables

• Inputs:
  – gl_Vertex: object space position
  – gl_Normal: object space normal
• Outputs:
  – gl_FrontColor: write the vertex color to this variable (or don't, if you compute the color in the fragment shader)
  – gl_Position: write the output clip-space position here (it becomes NDC after the perspective divide)


Fragment shader built-in variables

• Input:
  – gl_Color: interpolated per-vertex color
• Output:
  – gl_FragColor: write the pixel color here


Both shader variables

• Inputs:
  – gl_LightModel (of type gl_LightModelParameters)
    • gl_LightModel.ambient
  – gl_LightSource[0].diffuse, etc.
  – gl_FrontMaterial.diffuse, etc.
  – gl_ModelViewMatrix
  – gl_ProjectionMatrix
  – gl_NormalMatrix (use instead of the modelview matrix to transform normals to eye space)


Built-in functions

• dot() - takes the dot product of two vectors
• sin, cos, pow, etc. - similar to math.h in C
  – Angles are in radians
• ftransform() - performs the fixed-function vertex transform
• reflect() - reflects a vector about another vector
• Many more! See the GLSL reference


Communication between shaders

• What if a shader needs more information than given by the built-ins?
  – Special keywords...
• Vertex -> fragment: varying
• CPU -> both (per render): uniform
• CPU -> vertex shader (per vertex): attribute
• CPU -> both (large data): use a texture
  – We'll see this next week...


Sample vertex shader

uniform int ambientEnabled;
varying vec3 normal, lightDir;

void main() {
    normal = normalize(gl_NormalMatrix * gl_Normal);
    vec3 worldPos = vec3(gl_ModelViewMatrix * gl_Vertex);
    lightDir = normalize(vec3(gl_LightSource[0].position) - worldPos);
    gl_Position = ftransform();
}


Sample fragment shader

uniform int ambientEnabled;     // same as in vertex shader
varying vec3 normal, lightDir;  // same as in vertex shader

void main() {
    vec4 color = vec4(0, 0, 0, 0);
    if (ambientEnabled != 0)    // if() takes a bool, so compare the int
        color = gl_LightModel.ambient * gl_FrontMaterial.ambient;
    float NdotL = max(dot(normalize(normal), normalize(lightDir)), 0.0);
    color += gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse * NdotL;
    gl_FragColor = color;
}


Using shaders

• Example included in lab 1
  – Can use that code in future projects

  p = glCreateProgram();
  v = glCreateShader(GL_VERTEX_SHADER);
  f = glCreateShader(GL_FRAGMENT_SHADER);
  glShaderSource(v, 1, &vertSrc, NULL);
  glShaderSource(f, 1, &fragSrc, NULL);
  glCompileShader(v);
  glCompileShader(f);
  glAttachShader(p, v);
  glAttachShader(p, f);
  glLinkProgram(p);
  glUseProgram(p);


Setting uniforms

• L = glGetUniformLocation(p, "variableName");
• glUniform1f(L, value);
  – Use glUniform1i for integer uniforms, etc.
• Shaders must declare the variable outside of the main() function using the uniform keyword
