TRANSCRIPT
1
Graphics, Day 2: Entering the Third Dimension
Based on lecture by Ed Angel
© Ed Angel
Uncredited images from Ed Angel
2
Objectives
Brief historical introduction
Meet our first Pioneer – Ivan Sutherland
Fundamental imaging notions
Physical basis for image formation
Light
Color
Perception
Synthetic camera model
Other models – Ray Tracing
Escape to 3D!
Sierpinski Gasket in 3D
Shader Programs
Mathematics: meaning of Dot product, applications
aleksandarrodic.com/p/jellyfish/
3
Basic Graphics System
Input devices
Output device
Image formed in Frame Buffer
4
CRT
Can be used either as a line-drawing device (calligraphic) or to display contents of frame buffer (raster mode)
5
Computer Graphics: 1950-1960
Computer graphics goes back to the earliest days of computing
Strip charts
Pen plotters
Simple displays using A/D converters to go from computer to calligraphic CRT
Cost of refresh for CRT too high
Computers slow, expensive, unreliable
API very device specific
6
Computer Graphics: 1960-1970
Wireframe graphics
Draw only lines
Ivan Sutherland's Sketchpad
Display Processors
Storage tube
wireframe representation of sun object
7
Sketchpad
Ivan Sutherland’s PhD thesis at MIT (1963)
Recognized the potential of man-machine interaction
Loop
Display something
User moves light pen
Computer generates new display
Sutherland also created many of the widely used algorithms for computer graphics
Alan Kay presents Sutherland's sketchpad
http://www.youtube.com/watch?v=495nCzxM9PI
8
Display Processor
Rather than have the host computer refresh the display, use a special-purpose computer called a display processor (DPU)
Graphics stored in display list on display processor
Host compiles display list and sends to DPU
9
Direct View Storage Tube
Created by Tektronix
Did not require constant refresh
Standard interface to computers
Allowed for standard software
Plot3D in Fortran
Relatively inexpensive
Opened door to use of computer graphics for CAD community
Drew lines - vector graphics
10
Computer Graphics: 1970-1980
Raster Graphics
Beginning of graphics standards
IFIPS
GKS: European effort
Becomes ISO 2D standard
Core: North American effort
3D but fails to become ISO standard
Workstations and PCs
11
Raster Graphics
Image produced as an array (raster) of picture elements (pixels) in the frame buffer – special purpose memory read by the display
12
Raster Graphics
Allows us to go from wire frame images to filled polygons
Note different patches have different shading
Can see edges and vertices
13
PCs and Workstations
Although we no longer make the distinction between workstations and PCs, historically they evolved from different roots
Early workstations characterized by
Networked connection: client-server model
High-level of interactivity
Early PCs included frame buffer as part of user memory
Easy to change contents of frame buffer and create images
14
Computer Graphics: 1980-1990
Realism comes to computer graphics
smooth shading environment mapping
bump mapping
15
Bump Map
Change surface plane
Normal vector
Note edge is smooth
16
Computer Graphics: 1980-1990
Special purpose hardware
Silicon Graphics geometry engine
VLSI implementation of graphics pipeline
Industry-based standards
PHIGS
Pixar's RenderMan API
Networked graphics: X Window System
Human-Computer Interface (HCI)
17
Computer Graphics: 1990-2000
OpenGL API
Computer-generated movies in theaters
New hardware capabilities
Texture mapping
Blending
Accumulation, stencil buffers
Pixar
18
Computer Graphics: 2000-
Photorealism
Graphics cards for PCs dominate market
Nvidia, ATI, 3DLabs
Game boxes and game players determine direction of market
Computer graphics routine in movie industry: Maya, Lightwave
Programmable pipelines
Grand Theft Auto
19
Image Formation
In computer graphics, we form images, which are generally two dimensional, using a process analogous to how images are formed by physical imaging systems
Cameras
Microscopes
Telescopes
Human visual system
Wikipedia
20
Elements of Image Formation
Objects
Viewer
Light source(s)
How light interacts with material
Glossy or matte?
Note independence of objects, viewer, and light source(s)
Generating shadows is not easy – a theme we will return to
21
Light
Light is the part of the electromagnetic spectrum that causes a reaction in our visual systems
Generally these are wavelengths in the range of about 350-750 nm (nanometers)
Long wavelengths appear as reds and short wavelengths as blues
22
Luminance and Color Images
Luminance Image
Monochromatic
Values are gray levels
Analogous to working with black and white film
Color Image
Has perceptual attributes of hue, saturation, and lightness
Do we have to match every frequency in visible spectrum? No!
23
Three-Color Theory
Human visual system has two types of sensors
Rods: monochromatic, night vision
Cones
Color sensitive
Three types of cones
Only three values (the tristimulus values) are sent to the brain
Need only match these three values to fool eye
Need only three primary colors
24
Shadow Mask CRT
25
Additive and Subtractive Color
Additive color: form a color by adding amounts of three primaries
CRTs, projection systems, positive film
Primaries are Red (R), Green (G), Blue (B)
Subtractive color: form a color by filtering white light with cyan (C), magenta (M), and yellow (Y) filters
Light-material interactions
Printing
Negative film
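The two systems are complements of each other: each subtractive primary filters out one additive primary. A minimal sketch, assuming normalized float channels (the helper names are mine):

```javascript
// Convert between additive (RGB) and subtractive (CMY) primaries.
// Each channel is a float in [0, 1]; the subtractive primaries are
// simply the complements of the additive ones.
function rgbToCmy(rgb) {
  return rgb.map(function (v) { return 1.0 - v; });
}
function cmyToRgb(cmy) {
  return cmy.map(function (v) { return 1.0 - v; });
}

// Pure red in RGB means filtering out green and blue: magenta + yellow.
console.log(rgbToCmy([1.0, 0.0, 0.0])); // [0, 1, 1]
```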
26
Synthetic Camera Model
center of projection
image plane
projector
p
projection of p
27
Advantages
Separation of objects, viewer, light sources
2D graphics is a special case of 3D graphics
Leads to simple software API
Specify objects, lights, camera, attributes
Let implementation determine image
Leads to fast hardware implementation
28
Global vs Local Lighting
Limits:
Cannot compute color or shade of each object independently
Some objects are blocked from light
Light can reflect from object to object
Some objects might be translucent
29
Alternatives
Ray Tracing and Ray Casting
Wikipedia
30
Why not ray tracing?
Ray tracing is more physically based
Simpler for objects such as polygons and quadrics
Can produce global lighting effects such as shadows and multiple reflections
Why don't we use it to design a graphics system?
Ray tracing is slow and not well-suited for many interactive applications
Wikipedia
Return to Sierpinski Gasket
Discovered by Waclaw Sierpinski in 1915
Start with a triangle
In each step, remove 25% of the remaining area
We are left with a shape with area 0
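The steps above can be sketched numerically: since each step keeps 3/4 of the area, after n steps the remaining area is (3/4)^n, which tends to 0 (the function name is mine):

```javascript
// Area remaining in the Sierpinski gasket after n subdivision steps.
// Each step removes the middle quarter of every remaining triangle,
// so 3/4 of the area survives each step.
function remainingArea(initialArea, steps) {
  return initialArea * Math.pow(0.75, steps);
}

console.log(remainingArea(1.0, 10)); // ~0.056, tending to 0 as steps grow
```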
Sierpinski Gasket
The odd numbers in Pascal’s Triangle are in bold
Can view trios of three bold numbers as a black triangle
See also S. Wolfram’s Cellular Automata rule 90
1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
1 7 21 35 35 21 7 1
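The parity pattern above can be generated directly from the binomial coefficients; a small sketch (the helper name is mine), with odd entries printed as stars to show the gasket:

```javascript
// Parity of a row of Pascal's triangle: odd entries trace out the
// Sierpinski gasket, equivalent to cellular automaton rule 90.
function parityRow(n) {
  var row = [1];
  for (var k = 1; k <= n; k++) {
    row.push(row[k - 1] * (n - k + 1) / k); // binomial C(n, k), exact for small n
  }
  return row.map(function (v) { return v % 2; });
}

for (var n = 0; n < 8; n++) {
  console.log(parityRow(n).map(function (b) { return b ? '*' : ' '; }).join(''));
}
```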
33
Shader Definitions
Last time we looked at gasket1.html
gasket1v2 gives the same image, but the shader programs are not listed in the .html file
Programs are stored in subdirectory
$ diff gasket1*html
...
28c8
< <script type="text/javascript" src="../Common/initShaders.js"></script>
---
> <script type="text/javascript" src="../Common/initShaders2.js"></script>
34
Entering the 3rd Dimension
Today's sample program
Sierpinski's pyramid
What do you see?
What must we do?
35
Gasket4 – minor changes
% diff gasket4.html gasket2.html
6c6
< <title>3D Sierpinski Gasket</title>
---
> <title>2D Sierpinski Gasket</title>
...
// Major changes...
40c35
< <script type="text/javascript" src="gasket4.js"></script>
---
> <script type="text/javascript" src="gasket2.js"></script>
Major changes
% diff gasket4.html gasket2.html
...
13,15c13
< attribute vec3 vPosition;
< attribute vec3 vColor;
< varying vec4 color;
---
> attribute vec4 vPosition;
20,21c18
< gl_Position = vec4(vPosition, 1.0);
< color = vec4(vColor, 1.0);
---
> gl_Position = vPosition;
27,28d23
<
< varying vec4 color;
33c28
< gl_FragColor = color;
---
> gl_FragColor = vec4( 0.0, 0.0, 1.0, 1.0 );
...
We have different colors
Can no longer use fixed color
37
Gasket4 vertex shader
<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec3 vPosition;
attribute vec3 vColor;
varying vec4 color;
void
main()
{
gl_Position = vec4(vPosition, 1.0);
color = vec4(vColor, 1.0);
}
</script>
What can we conclude about vColor vs color?
Which are Gazintas and which are Comzatas?
38
Gasket4 fragment shader
<script id="fragment-shader" type="x-shader/x-fragment">
precision mediump float;
varying vec4 color;
void
main()
{
gl_FragColor = color;
}
</script>
Which are Gazintas and which are Comzatas?
39
Variable Scope
attribute: per-vertex information sent from the CPU:
coordinates, texture coords, normals, color, …
Predefined and user defined
varying: interpolated data that a vertex shader sends to the fragment shaders
40
On one page…
<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec3 vColor;
varying vec4 color;
color = vec4(vColor, 1.0);
</script>
<script id="fragment-shader" type="x-shader/x-fragment">
varying vec4 color;
gl_FragColor = color;
</script>
Which are Gazintas and which are Comzatas?
41
Pipeline
Gasket4 JavaScript
% diff gasket2.js gasket4.js
5a6
> var colors = [];
7c8
< var NumTimesToSubdivide = 5;
---
> var NumTimesToSubdivide = 3;
20c23
< // First, initialize corners of our gasket with three points.
< vec2( -1, -1 ),
< vec2( 0, 1 ),
< vec2( 1, -1 )
---
> // First, initialize the vertices of our 3D gasket
> vec3( 0.0000, 0.0000, -1.0000 ),
> vec3( 0.0000, 0.9428, 0.3333 ),
> vec3( -0.8165, -0.4714, 0.3333 ),
> vec3( 0.8165, -0.4714, 0.3333 )
We start with some diffs
43
Gasket4 JavaScript
26,28c28,30
< divideTriangle( vertices[0], vertices[1], vertices[2],
< NumTimesToSubdivide);
---
> divideTetra( vertices[0], vertices[1], vertices[2], vertices[3], NumTimesToSubdivide);
...
45d57
< // Associate out shader variables with our data buffer
< gl.vertexAttribPointer( vPos, 2, gl.FLOAT, false, 0, 0 );
---
> gl.vertexAttribPointer( vPos, 3, gl.FLOAT, false, 0, 0 );
44
Hidden Line Removal
34a37,38
> gl.enable(gl.DEPTH_TEST);
...
78c115
< gl.clear( gl.COLOR_BUFFER_BIT );
---
> gl.clear( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
We talked about Back Face Culling last week
We may see that again tonight
However, this uses something else, called the Z-Buffer
45
Gasket4
var vBufferId = gl.createBuffer();
gl.bindBuffer( gl.ARRAY_BUFFER, vBufferId );
gl.bufferData( gl.ARRAY_BUFFER, flatten(points), gl.STATIC_DRAW );
var vPos = gl.getAttribLocation( program, "vPosition" );
gl.vertexAttribPointer( vPos, 3, gl.FLOAT, false, 0, 0 );
gl.enableVertexAttribArray( vPos );
We saw this last time…
46
Gasket4
var points = [];
var colors = [];
...
// Create a buffer object, initialize it, and associate it with the
// associated attribute variable in our vertex shader
var cBufferId = gl.createBuffer();
gl.bindBuffer( gl.ARRAY_BUFFER, cBufferId );
gl.bufferData( gl.ARRAY_BUFFER, flatten(colors), gl.STATIC_DRAW );
var vColor = gl.getAttribLocation( program, "vColor" );
gl.vertexAttribPointer( vColor, 3, gl.FLOAT, false, 0, 0 );
gl.enableVertexAttribArray( vColor );
var vBufferId = gl.createBuffer();
gl.bindBuffer( gl.ARRAY_BUFFER, vBufferId );
gl.bufferData( gl.ARRAY_BUFFER, flatten(points), gl.STATIC_DRAW );
var vPos = gl.getAttribLocation( program, "vPosition" );
gl.vertexAttribPointer( vPos, 3, gl.FLOAT, false, 0, 0 );
gl.enableVertexAttribArray( vPos );
What is this?
47
JavaScript
function triangle( a, b, c, color )
{
var baseColors = [
vec3(1.0, 0.0, 0.0),
vec3(0.0, 1.0, 0.0),
vec3(0.0, 0.0, 1.0),
vec3(0.0, 0.0, 0.0)
];
colors.push( baseColors[color] );
points.push( a );
colors.push( baseColors[color] );
points.push( b );
colors.push( baseColors[color] );
points.push( c );
}
So in this scope, color is an index in the domain [0-3]
48
JavaScript
function triangle( a, b, c, color )
{
var baseColors = [
vec3(1.0, 0.0, 0.0),
vec3(0.0, 1.0, 0.0),
vec3(0.0, 0.0, 1.0),
vec3(0.0, 0.0, 0.0)
];
colors.push( baseColors[color] );
points.push( a );
colors.push( baseColors[color] );
points.push( b );
colors.push( baseColors[color] );
points.push( c );
}
Why do we push same color three times?
49
JavaScript
var baseColors = [
vec3(1.0, 0.0, 0.0),
vec3(0.0, 1.0, 0.0),
vec3(0.0, 0.0, 1.0),
vec3(0.0, 0.0, 0.0)
];
...
function tetra( a, b, c, d )
{
triangle( a, c, b, 0 );
triangle( a, c, d, 1 );
triangle( a, b, d, 2 );
triangle( b, c, d, 3 );
}
So in this scope, color is an index in the domain [0-3]
50
JavaScript
function divideTetra( a, b, c, d, count )
{
if ( count === 0 ) {
tetra( a, b, c, d );
}
else {
var ab = mix( a, b, 0.5 );
var ac = mix( a, c, 0.5 );
var ad = mix( a, d, 0.5 );
...
--count;
divideTetra( a, ab, ac, ad, count );
divideTetra( ab, b, bc, bd, count );
divideTetra( ac, bc, c, cd, count );
divideTetra( ad, bd, cd, d, count );
}
}
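divideTetra relies on a mix() helper, which the course code pulls in from the Common/MV.js library. A minimal stand-in, assuming vectors are plain JavaScript arrays:

```javascript
// Minimal stand-in for the mix() helper used by divideTetra.
// Linearly interpolates two vectors: mix(u, v, s) = (1-s)*u + s*v,
// componentwise. With s = 0.5 this is the midpoint of an edge,
// which is exactly how the tetrahedron edges are split.
function mix(u, v, s) {
  return u.map(function (ui, i) { return (1 - s) * ui + s * v[i]; });
}

console.log(mix([0, 0, -1], [0, 0.9428, 0.3333], 0.5)); // edge midpoint
```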
51
Shader Outline
What is a GPU?
Why perform GPU processing?
Challenges of Parallel Processing
Basics of GPU
Vertex Shader
Fragment Shader
Programming Paradigm
Examples
To understand Gazintas and Comzatas, we dig a bit into how shaders work
What is a GPU?
Raster Graphics use a Frame Buffer to hold pixels
We can set the color for a single pixel by writing a value to a fixed location
Frame Buffer == Memory that holds graphical image
A GPU is specialized silicon for offloading graphics processing from the CPU
While there have been GPUs going back at least to the Commodore Amiga, the term currently implies the ability to program the GPU
53
What Languages?
Offline Rendering
RenderMan Shading Language – the first shader language
Houdini and Gelato – modeled on RenderMan
Real-time shaders
ARB Shaders – low level shading
GLSL (OpenGL Shading Language)
Cg – NVIDIA
DirectX HLSL (High Level Shader Language)
54
NVIDIA G70
55
Parallelism
Computer Scientists have been looking for a way to use multiple CPUs to speed up calculations
Obstacles include
Sequential nature of many computations
Fighting over shared data
a[i] = a[i-1] + i;
There have been isolated areas of success
Numerical Simulations
SQL
Graphics
Shading polygons is another place for parallelism
I can transform v1 and v2 independently
I can scan line y = 2 without affecting y = 3
I can also scan y = 3, x < 3
without affecting y = 3, x > 3
Both write to frame buffer, but update different spots.
Two models of parallelism
SIMD
Single Instruction, Multiple Data
One code path, multiple processors working in parallel
Data Driven streams
Sea of data washes over bed of computational units
Data module has all relevant information
When a complete data unit meets a free computational unit, the data is transformed
Flows down stream for the next operation
58
Graphics Languages
Unlike a CPU, a GPU hides its architecture details
OpenGL or DirectX provide a state machine that represents the rendering pipeline.
Early GPU programs used properties of the state machine to “program” the GPU.
Tools like Renderman RSL provided sophisticated shader languages, but these were not part of the rendering pipeline.
59
Prior Art
One “programmed” in OpenGL using state variables like blend functions, depth tests and stencil tests
glClearColor(0.0, 0.0, 0.0, 0.0);
glClearStencil(0);
glStencilMask(1); // Can only write LSBit
glEnable(GL_STENCIL_TEST);
glClear(GL_COLOR_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glStencilFunc(GL_ALWAYS, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
60
Programmable Shaders, v1
As rendering pipeline became more complex, new functionality was added to the state machine via extensions.
The introduction of vertex and fragment programs provided full programmability to the pipeline.
With fragment programs, one could write general programs to process each fragment
MUL tmp, fragment.texcoord[0], size.x;
FLR intg, tmp;
FRC frac, tmp;
SUB frac_1, frac, 1.0;
Writing (pseudo)-assembly code is clumsy and error-prone.
61
GLSL
Current GPU languages, such as Cg and GLSL allow programmer to work in something resembling C
uniform float time; /* in milliseconds */
void main(){
float s;
vec4 t = gl_Vertex;
t.y = 0.1*sin(0.001*time+ 5.0*gl_Vertex.x) *sin(0.001*time+5.0*gl_Vertex.z);
gl_Position = gl_ModelViewProjectionMatrix * t;
gl_FrontColor = gl_Color;
}
62
Model
All the language models share basic properties:
1. They view the pipeline as an array of "pixel computers", with the same program running at each computer. There are two types of computers: vertex computers and fragment computers.
2. Data is streamed to each pixel computer.
3. The pixel programs have limited state.
Issues – communication between components
Between CPU and GPU
Between Vertex and Fragment Shaders
Between different Vertex (Fragment) Shaders
63
Programmable Pipeline Elements
We will define vertex shader and fragment shader programs
There are predefined:
variables, holding position, color, etc
OpenGL state, such as
matrices: Model and View transformation
Order of operations is implicit
We can define new variables and functions
We need to declare the scope of these variables
64
Communication
For CPU to communicate to Shaders
Can used predefined attributes
Can define uniform variables
Can use textures (called samplers)
65
Variable Scope
const: compile time constant
uniform: parameter is fixed over a glBegin/glEnd pair
attribute: per-vertex information sent from the CPU:
coordinates, texture coords, normals, color, …
Predefined and user defined
varying: interpolated data that a vertex shader sends to the fragment shaders
66
Gasket4 vertex shader
<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec3 vPosition;
attribute vec3 vColor;
varying vec4 color;
void
main()
{
gl_Position = vec4(vPosition, 1.0);
color = vec4(vColor, 1.0);
}
</script>
What can we conclude about vColor vs color?
67
Datatypes
Singletons
float, bool, int – no bitmaps
float a, b;
int c = 2;
bool d = true;
Vectors
vec{2, 3, 4} – vector of floats
vec4 eyePosition = gl_ModelViewMatrix * gl_Vertex;
vec2 a = vec2(1.0,2.0);
vec2 b = vec2(3.0,4.0);
vec4 c = vec4(a, b); // c = vec4(1.0, 2.0, 3.0, 4.0)
bvec{2, 3, 4} – boolean vector
ivec{2, 3, 4} – integer vector
68
Vertex builtins
Per-Vertex attributes
in int gl_VertexID;
in int gl_InstanceID;
out gl_PerVertex {
vec4 gl_Position;
float gl_PointSize;
float gl_ClipDistance[];
};
Global Attributes
in vec4 gl_Color;
in vec4 gl_SecondaryColor;
in vec3 gl_Normal;
in vec4 gl_Vertex;
in vec4 gl_MultiTexCoord0;…
in vec4 gl_MultiTexCoord7;
in float gl_FogCoord;
Fragment Shader
uniform float time;
void main()
{
float d = length(gl_FragCoord.xy);
gl_FragColor.r = 0.5*(1.0+sin(0.001*time))*gl_FragCoord.x/d;
gl_FragColor.g = 0.5*(1.0+cos(0.001*time))*gl_FragCoord.y/d;
gl_FragColor.b = gl_FragCoord.z;
gl_FragColor.a = 1.0;
}
(From GLSL Spec) The built-in special variables that are accessible from a fragment shader are intrinsically declared as follows:
in vec4 gl_FragCoord;
in bool gl_FrontFacing;
in float gl_ClipDistance[];
out vec4 gl_FragColor; // deprecated
out vec4 gl_FragData[gl_MaxDrawBuffers]; // deprecated
out float gl_FragDepth;
70
Recap
Program loads vertex and fragment shaders
Shaders take standard attributes and any program specific additions, and compute standard results and additional variables
Last week we used a standard attribute gl_Position to communicate between vertex and fragment shader.
This week we introduced a varying term, color
varying: interpolated data that a vertex shader sends to the fragment shaders
71
Variable Scope
const: compile time constant
uniform: parameter is fixed over a transaction
attribute: per-vertex information sent from the CPU:
coordinates, texture coords, normals, color, …
Predefined and user defined
varying: interpolated data that a vertex shader sends to the fragment shaders
72
Dot Product
Let C = A − B. Then
C • C = (A − B) • (A − B)
C • C = A • A + B • B − 2 A • B
|C|^2 = |A|^2 + |B|^2 − 2 A • B
But the Law of Cosines tells us that
|C|^2 = |A|^2 + |B|^2 − 2 |A| |B| cos(theta)
Putting these together, we get
A • B = |A| |B| cos(theta)
cos(theta) = A • B / (|A| |B|)
Math Corner
73
Vector Mathematics
Dot product – measures length and angle
We can tease out one term
A•B = |A| |B| cos(theta)
cos(theta) = A•B / (|A| |B|)
Perpendicular vectors have a dot product of 0
Given vector v1 = (a, b)
Build perpendicular vector v2 = (-b, a)
v1 . v2 = -ab + ba = 0, so they are perpendicular
Could also use (b, -a) = -v2
Given vector v = (a, b, c) we have many perpendiculars
(-b, a, 0), (0, -c, b), (-c, 0, a) and combinations of these.
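The facts above can be sketched in a few lines (the helper names are mine):

```javascript
// Dot product, and the perpendicular-vector trick from the slide:
// for v1 = (a, b), the vector v2 = (-b, a) satisfies
// v1 . v2 = -ab + ba = 0, so v1 and v2 are perpendicular.
function dot(u, v) {
  return u.reduce(function (sum, ui, i) { return sum + ui * v[i]; }, 0);
}
function perp2(v) {
  return [-v[1], v[0]];
}

console.log(dot([3, 4], perp2([3, 4]))); // 0
```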
74
Applications
If A•B == 0 and |A| ≠ 0 and |B| ≠ 0, then A and B are perpendicular
We can define a line through the origin as the set of points that are the endpoints of vectors perpendicular to a fixed vector v
The vector is called the normal vector to the line
v = (a, b), then the line is ax + by = 0
All (x, y) such that (x, y)•(a, b) = ax + by = 0
We can translate the line away from the origin to get ax + by = c
Alternative to y = mx + b
A normal vector can also be used to define a plane in 3D
The normal also defines a "front side"
Wolfram Mathworld
75
Applications
Backface culling – we use the fact that A•B < 0 means that the angle is > 90°
When deciding if a triangle is facing the camera or is facing away, we compute the dot product of the normal vector to the polygon with the vector from the camera to one of the polygon's vertices.
Gives fast test to see if we need to draw the triangle: those that are facing away are culled (removed)
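A sketch of that test, assuming an outward-facing normal and the sign convention described above (the helper names are mine, not course code):

```javascript
// Backface test sketch: a triangle faces away from the camera when the
// dot product of its outward normal with the camera-to-vertex vector
// is positive (i.e., the angle between them is under 90 degrees).
function dot3(u, v) {
  return u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
}
function isBackFacing(normal, cameraPos, vertex) {
  var toVertex = [vertex[0] - cameraPos[0],
                  vertex[1] - cameraPos[1],
                  vertex[2] - cameraPos[2]];
  return dot3(normal, toVertex) > 0;
}

// Camera at the origin, triangle at z = -5: a normal pointing back at
// the camera (+z) means front-facing; a normal pointing away means cull.
console.log(isBackFacing([0, 0, 1], [0, 0, 0], [0, 0, -5]));  // false
console.log(isBackFacing([0, 0, -1], [0, 0, 0], [0, 0, -5])); // true
```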
76
Applications
Back to our video game: The Black Knight
Collision detection
Does a line intersect a circle?
Line, not line segment
Is a circle just the rim or a full disk?
What do you need to answer the question?
77
Applications
We want to know the distance between a point x and a line L
Take a point y on the line L, and draw a vector A from y to x
Project the vector A onto the vector B giving us a new vector P
First we compute the length of projection P by elementary trig
|P| = |A| cos(theta)
78
Applications
We can rewrite the length of P using our formula for cos(theta):
|P| = |A| cos(theta) = |A| (A • B) / (|A| |B|) = (A • B) / |B|
Convert to a vector: the direction of P is the same as B
P = ((A • B) / |B|) (B / |B|) = (A • B) B / |B|^2
79
Applications
We want to know the distance between a point x and a line L
Take a point y on the line L, and draw a vector A from y to x
Project the vector A onto the vector B
Distance from x to the line is length of perpendicular A-P
P = (A • B) B / |B|^2
(Easy) Exercise: show B • (A − P) = 0
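The projection and distance computation, plus a numerical check of the exercise (the helper names are mine):

```javascript
// Project vector A onto vector B: P = (A.B / |B|^2) * B.
// The distance from the point to the line is then |A - P|.
function dot2(u, v) { return u[0] * v[0] + u[1] * v[1]; }
function project(A, B) {
  var s = dot2(A, B) / dot2(B, B);
  return [s * B[0], s * B[1]];
}
function distanceToLine(A, B) {
  var P = project(A, B);
  var D = [A[0] - P[0], A[1] - P[1]]; // the perpendicular component A - P
  return Math.sqrt(dot2(D, D));
}

// Point (3, 4) relative to the x-axis (direction B = (1, 0)):
console.log(project([3, 4], [1, 0]));        // [3, 0]
console.log(distanceToLine([3, 4], [1, 0])); // 4
```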
80
OpenGL Primitives
GL_POINTS
GL_LINES
GL_LINE_LOOP
GL_LINE_STRIP
GL_TRIANGLES
GL_TRIANGLE_STRIP
GL_TRIANGLE_FAN
GL_QUAD_STRIP
GL_POLYGON
OpenGL Corner
WebGL supports some of these: POINTS, LINES, TRIANGLES
81
Polygon Issues
OpenGL will correctly display only polygons that are
Simple: edges cannot cross
Convex: All points on line segment between two points in a polygon are also in the polygon
Flat: all vertices are in the same plane
A user program can check whether the above are true
OpenGL will produce output if these conditions are violated but it may not be what is desired
Triangles satisfy all conditions: we can always use triangles
nonsimple polygon nonconvex polygon
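The convexity condition above can be checked by a user program with cross products; a sketch (mine, not course code):

```javascript
// Convexity sketch for a 2D polygon: walk the edges and require all
// cross products (z components of consecutive edge pairs) to share a
// sign. Triangles always pass, which is why we can always use them.
function isConvex(poly) {
  var sign = 0;
  for (var i = 0; i < poly.length; i++) {
    var a = poly[i];
    var b = poly[(i + 1) % poly.length];
    var c = poly[(i + 2) % poly.length];
    var cross = (b[0] - a[0]) * (c[1] - b[1]) - (b[1] - a[1]) * (c[0] - b[0]);
    if (cross !== 0) {
      if (sign === 0) sign = cross > 0 ? 1 : -1;
      else if ((cross > 0 ? 1 : -1) !== sign) return false;
    }
  }
  return true;
}

console.log(isConvex([[0, 0], [1, 0], [0, 1]]));                 // true
console.log(isConvex([[0, 0], [2, 0], [1, 1], [2, 2], [0, 2]])); // false
```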
82
Attributes
Attributes are part of the OpenGL state and determine the appearance of objects
Color (points, lines, polygons)
Size and width (points, lines)
Stipple pattern (lines, polygons)
Polygon mode
Display as filled: solid color or stipple pattern
Display edges
Display vertices
83
RGB color
Each color component is stored separately in the frame buffer
Usually 8 bits per component in buffer: 24 bit color
8 bits can store numbers from 0 to 255 as Unsigned Bytes
Note in glColor3f the color values range from 0.0 (none) to 1.0 (all), whereas in glColor3ub the values range from 0 to 255
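A sketch of converting between the two conventions (the helper names are mine):

```javascript
// Convert a glColor3f-style float color (0.0..1.0) to glColor3ub-style
// unsigned bytes (0..255) and back.
function floatToUbyte(c) {
  return c.map(function (v) { return Math.round(v * 255); });
}
function ubyteToFloat(c) {
  return c.map(function (v) { return v / 255; });
}

console.log(floatToUbyte([1.0, 0.5, 0.0])); // [255, 128, 0]
```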
84
Indexed Color
An alternative is Indexed Color
Colors are indices into tables of RGB values (see next slide for GIF scheme)
Requires less memory for color
Indices usually 8 bits
Not as important now
Memory inexpensive
Need more colors for shading
85
GIF Color Map
GIF, introduced by CompuServe, uses the idea of indirection
Rather than store the value, store a reference to the value
We have an image with 24-bit colors for each pixel
To save storage, store an index into a small table of colors (color map)
Rather than store 24 bits, we typically pick 256 colors and store 8 bits
Lossy if original has more than 256 colors
Works better for the Simpsons than the Sopranos
Store the color map first, then the array of references to the map
Color map (index → 24-bit color):
0 → 0x00ED9D
1 → 0xFF0000
2 → 0xFF69F0

Original Image (24-bit color per pixel):
0x00ED9D 0x00ED9D 0x00ED9D 0x00ED9D 0x00ED9D
0x00ED9D 0x00ED9D 0xFF69F0 0xFF69F0 0xFF69F0
0x00ED9D 0xFF69F0 0x00ED9D 0x00ED9D 0x00ED9D
0xFF69F0 0x00ED9D 0x00ED9D 0x00ED9D 0x00ED9D
0xFF69F0 0x00ED9D 0xFF0000 0x00ED9D 0x00ED9D
0xFF69F0 0x00ED9D 0x00ED9D 0x00ED9D 0x00ED9D
0xFF69F0 0x00ED9D 0x00ED9D 0x00ED9D 0x00ED9D
0xFF69F0 0x00ED9D 0x00ED9D 0xFF0000 0xFF0000

Compressed Image (8-bit index per pixel):
0 0 0 0 0
0 0 2 2 2
0 2 0 0 0
2 0 0 0 0
2 0 1 0 0
2 0 0 0 0
2 0 0 0 0
2 0 0 1 1
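The indexing step above can be sketched in a few lines (my helper names, not GIF's actual encoder):

```javascript
// Indexed-color sketch: replace each 24-bit pixel with an index into a
// small color map, as GIF does. New colors are appended to the map the
// first time they appear.
function indexImage(pixels) {
  var map = [], indices = [];
  pixels.forEach(function (p) {
    var i = map.indexOf(p);
    if (i < 0) { i = map.length; map.push(p); }
    indices.push(i);
  });
  return { map: map, indices: indices };
}

var r = indexImage([0x00ED9D, 0x00ED9D, 0xFF69F0, 0xFF0000, 0xFF69F0]);
console.log(r.map.length); // 3 map entries instead of 5 full colors
console.log(r.indices);    // [0, 0, 1, 2, 1]
```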
86
Coordinate Systems
The units for vertices are determined by the application and are called object or problem coordinates
The viewing specifications are also in object coordinates and it is the size of the viewing volume that determines what will appear in the image
Internally, OpenGL will convert to camera (eye) coordinates and later to screen coordinates
OpenGL also uses some internal representations that usually are not visible to the application
87
OpenGL Camera
OpenGL places a camera at the origin in object space pointing in the negative z direction
The default viewing volume is a box centered at the origin with a side of length 2 (+/- 1)
88
Orthographic Viewing
In the default orthographic view, points are projected forward along the z axis onto the plane z = 0
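The default camera above can be sketched as two tiny helpers (names are mine): a test for the canonical view volume (each coordinate in [-1, 1]) and the orthographic projection that drops z:

```javascript
// Is point p inside the default viewing volume (a box of side 2
// centered at the origin)?
function inViewVolume(p) {
  return p.every(function (v) { return v >= -1 && v <= 1; });
}
// Default orthographic projection: project along z onto the plane z = 0.
function orthoProject(p) {
  return [p[0], p[1], 0];
}

console.log(orthoProject([0.5, -0.25, 0.9])); // [0.5, -0.25, 0]
console.log(inViewVolume([0.5, -0.25, 1.5])); // false (z outside +/- 1)
```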
89
Perspective Viewing
We can also define a perspective view
90
Transformations and Viewing
OpenGL projection is carried out by a projection matrix (transformation)
We can build the projection matrix in the CPU, and apply it in the GPU
var render = function()
{
gl.clear( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
...
projectionMatrix = perspective(fovy, aspect, near, far);
...
gl.uniformMatrix4fv( projectionMatrixLoc, false,
flatten(projectionMatrix) );
gl.drawArrays( gl.TRIANGLES, 0, NumVertices );
requestAnimFrame(render);
}
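A hedged sketch of what the perspective() helper used above computes (the course code gets it from MV.js, whose exact storage layout may differ): the standard OpenGL perspective matrix from a vertical field of view in degrees, aspect ratio, and near/far planes.

```javascript
// Standard OpenGL-style perspective matrix (row-major sketch).
// f = cot(fovy/2) scales x and y; the third row maps [near, far]
// into clip depth; the last row copies -z into w for the divide.
function perspective(fovyDegrees, aspect, near, far) {
  var f = 1.0 / Math.tan((fovyDegrees * Math.PI / 180) / 2);
  var d = far - near;
  return [
    [f / aspect, 0, 0, 0],
    [0, f, 0, 0],
    [0, 0, -(near + far) / d, -2 * near * far / d],
    [0, 0, -1, 0]
  ];
}

var m = perspective(90, 1, 1, 100);
console.log(m[0][0], m[1][1]); // both approximately 1 when fovy = 90
```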
91
Vertex Shader<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec4 vPosition;
attribute vec4 vColor;
varying vec4 fColor;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
void main()
{
gl_Position = projectionMatrix*modelViewMatrix*vPosition;
fColor = vColor;
}
</script>
92
Summation
We can define a number of different attributes in OpenGL
Color is the simplest to see
We can define a number of different objects:
points, lines, triangles, …
We can move the objects, and move the camera independently
3D is a simple extension of 2D drawing
Currently, our camera uses orthographic projection
We will be able to define a perspective camera shortly
We can use Shaders to perform computations on the GPU