Outline for Today
Working with Framebuffers
Administration
Today
For Next Meeting
Wiki
Link to the UR Courses wiki page for this meeting
Media
Transcript
Audio Transcript
Well, here we are, meeting 21. So today let's look at some examples from chapter eight. Monday we'll touch on chapter nine, and we'll show you some examples of more advanced rendering methods. Wednesday we'll review; I think we can find some interesting demos to look at as well. That seems like a reasonable plan.
Okay, so let's look at a few examples. I realized the website has some typos in it that throw errors when I try to access the online version of some programs. So render4 and render5 and some other ones generate an error, but we can deal with them: we can make local copies and fix the typos. So let's look at this first example.
So what do you notice about this?

Yeah, the triangle has jagged edges.

Yeah. So let's have a look at the code.
We have two vertex shaders and two fragment shaders. The first vertex shader is just setting the position, and the first fragment shader is just passing out a constant color, so this is going to be a red triangle, or a red something. The second vertex shader, in addition to passing the position along, is getting the texture coordinates and passing those on to the fragment shader. Then the second fragment shader has the texture coordinate as input and the fragment color as output, and it has a uniform sampler2D texture map. So we're sampling a 2D texture map to create the fragment color, and the way we do that is with texture(), passing the sampler and then the texture coordinates.
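The texture() lookup in that second fragment shader can be sketched on the CPU. This is just an illustration, not the lecture code: sampleNearest and the little tex object are my own names, and it assumes nearest-neighbour filtering, which is also why a low-resolution texture shows blocky, jagged edges when stretched across the screen.

```javascript
// Sketch of what the fragment shader's texture() lookup does, on the CPU.
// A texture is a width x height grid of RGBA texels; the shader receives
// normalized coordinates in [0, 1] and, with NEAREST filtering, picks the
// closest texel. Names here are mine, not from the lecture program.
function sampleNearest(tex, u, v) {
  const x = Math.min(tex.width - 1, Math.floor(u * tex.width));
  const y = Math.min(tex.height - 1, Math.floor(v * tex.height));
  return tex.data[y * tex.width + x];   // RGBA array for that texel
}

// A tiny 2x2 texture: red texels in the left column, blue in the right.
const tex = {
  width: 2,
  height: 2,
  data: [
    [1, 0, 0, 1], [0, 0, 1, 1],
    [1, 0, 0, 1], [0, 0, 1, 1],
  ],
};
```

Scaling this 2 by 2 texture over a whole quad gives one hard red/blue edge, which is the same blockiness we see at 16 by 16 versus 64 by 64.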
So here we have our JavaScript. We have the texture coordinates, which correspond to the vertices of the quad that we're defining, and here are the triangle vertices. We're setting up two programs: one with vertex shader one and fragment shader one, and the other with vertex shader two and fragment shader two.
So we create an empty texture. We set texture unit zero to be active, bind the texture we're creating here, and make a 64 by 64 texture; then we release the binding of the texture when we bind texture null. Then we're going to allocate a framebuffer object, so we have a framebuffer and a renderbuffer. We're going to attach the texture we've defined to the framebuffer as its color buffer, so the framebuffer will get its colors from that texture.
This is checking that the framebuffer status is complete. Then we first use program one; that just does the pass-through that gives us the red color. So here we're going to render one triangle: the viewport is 64 by 64, the clear color is gray, and we draw one triangle. Then we unbind the texture from the framebuffer and from the renderbuffer, and then we bind texture one and render with it.
So what have we done? We have a 64 by 64 image that we're creating as a texture. So where is that image on the screen here? It's not just the triangle, it's the background as well. And to make it more jagged, we could do 16 by 16. So now let's see where the blue is defined. I missed it there; here's the blue. So this illustrates that we can have textures of different resolutions. Not a great revelation, perhaps, but it's an interesting use of framebuffers. And this is the one I referenced in the quiz.
So I've made a local copy so we can make some changes. I made it more diffuse, and I changed the color a little bit. Here's the HTML, following the shaders. This was originally one; let's change it to two, and let's change that one to two as well. So that gives the idea. I should have reversed the parameter changes, but anyway.
So there are two vertex shaders and two fragment shaders here. I added color based on the x value of the points. This is in vertex shader one: if the point is greater than zero in x, then it's going to be red; if it's less than or equal to zero in x, the color will be blue. So that's the first vertex shader.
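That coloring rule is simple enough to restate outside the shader. This is a CPU sketch in my own naming, not the shader source itself, with RGBA colors in [0, 1]:

```javascript
// The vertex shader's color rule from my modified copy, restated in JS:
// points with x > 0 are red, points with x <= 0 are blue. RGBA order.
function colorForX(x) {
  return x > 0 ? [1, 0, 0, 1] : [0, 0, 1, 1];
}
```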
The first fragment shader is taking the vColor from the vertex shader and passing it through to the fragment color, so pretty straightforward stuff. Then vertex shader two is doing the position and passing through the texture coordinates. The second fragment shader is where we set up these parameters: we get the x and y from the texture coordinates, and then we do texture lookups at x plus d, y plus d, x minus d, and y minus d. So we're sampling other locations: we get four values from the texture, sum them, and divide through by a value s. With four points, we're just computing the average.
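That averaging step can be sketched as a plain grid operation. This is a hedged CPU version, assuming the four samples are the axis-aligned neighbours at distance d with wrap-around at the edges; the real shader may choose its offsets slightly differently, and all names here are mine:

```javascript
// One diffusion pass: each cell becomes the sum of its four offset
// neighbours divided by a scale s. With s = 4 this is exactly their
// average. grid is a height x width array of scalar intensities.
function diffuseStep(grid, d, s) {
  const h = grid.length, w = grid[0].length;
  const out = grid.map(row => row.slice());
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      const xm = (x - d + w) % w, xp = (x + d) % w;  // wrap at the edges
      const ym = (y - d + h) % h, yp = (y + d) % h;
      out[y][x] = (grid[y][xm] + grid[y][xp] + grid[ym][x] + grid[yp][x]) / s;
    }
  }
  return out;
}
```

Running this repeatedly smears a bright spot out over the grid, which is the blurring effect we see on screen.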
Okay, so in order for this to work, what needs to happen? When do we call program one, with vertex shader one and fragment shader one?

Yes, yeah. Does that make sense?

So what about this one? What's different about it? We're actively adding points, so that requires a bit of a different approach.
So there are two sets of shaders for this program as well. In vertex shader one we're specifying the position, and we're also passing the texture coordinates. In vertex shader two we have the position in 2D, and we're also setting the point size. Fragment shader one has a uniform for the sampler2D texture map, the distance, and the scale.
So fragment shader one is another version of what we just looked at, and fragment shader two is not passing the color through; it's using the color that's been set in the uniform. So we have one program that's drawing and another one that's diffusing.
So we're creating two textures, and we bind to the second texture to begin with. We create those two programs, then we create a framebuffer object and bind to it. Here we're setting the locations of the points, so they're moving randomly. We set up the uniforms, the point size and the color. Then we switch to program one and set up the data for the vertices, for the points that we're going to draw. Then we create the texture and set its parameters, and then we bind the texture and render it. So we're switching between the two textures: we're updating one and then diffusing it into the other.
Once we've combined the two buffers, we detach from the framebuffer, because we're going to render to the screen. Then we adjust the particles, wrapping them around if we need to, and we substitute the data that we've already passed, so we're just updating the positions of the vertices. Then we swap the textures and do it again.
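The per-frame CPU side of that loop can be sketched like this. It's a hedged illustration in my own naming, assuming positions live in clip space [-1, 1] and that "wrap around" means reappearing on the opposite edge:

```javascript
// Wrap a clip-space coordinate back into [-1, 1] when a particle leaves.
function wrap(p) {
  if (p > 1) return p - 2;    // fell off the right/top edge
  if (p < -1) return p + 2;   // fell off the left/bottom edge
  return p;
}

// Nudge each coordinate by a small random step, wrapping as needed.
// positions is a flat [x0, y0, x1, y1, ...] array, as you would hand
// to bufferSubData when substituting the updated vertex data.
function updateParticles(positions, step) {
  return positions.map(p => wrap(p + (Math.random() * 2 - 1) * step));
}

// Ping-pong: each frame we render into one texture while sampling the
// other, then swap their roles for the next diffusion pass.
function swapTextures(t) {
  return { source: t.target, target: t.source };
}
```

Calling swapTextures every frame is what gives the back-and-forth between texture one and texture two described above.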
I have a better version of this that I'll post to the website. Does that give you an idea that there are lots of capabilities within WebGL, and that framebuffers and off-screen rendering can be a useful technique in creating graphics?
Okay, so here are those ideas from a different perspective. GPUs are very powerful: there's texture memory, you can have multiple framebuffers, and you can use floating-point numbers as well. This is the eighth edition of the textbook, and up to the sixth edition he was dealing with OpenGL, so that's why he keeps mentioning OpenGL, harking back to bygone years. If you're interested, vulkan.org is developing an open-source standard for desktop graphics.
Texture objects reduce transfers between the CPU and GPU. Transferring pixel data back to the CPU is slow, so we want to manipulate the pixels without going back to the CPU. That lets us do image processing and general-purpose computing with the GPU. So framebuffer objects, like the framebuffer and renderbuffer we saw, are not associated with the window system or the browser, and we can't display them directly.
-
We can apply with texture or
apply potassium to make them
-
visible so it can attach a
render buffer to a frame buffer
-
object and render off screen
into the attached buffer. Then
-
the attached buffer can be
detached and uses a texture map
-
for an auto Screen render to
default to the default frame
-
buffer so
-
so if we render to the texture
that's attached, then we can get
-
the sense of you can make
incremental changes, so you see
-
the
-
ant colony or Game of Life,
however we want to think about
-
it, so we can visualize those
things going on and
So we first create an empty texture object, then create a framebuffer object and attach the renderbuffer or the texture image. We bind the framebuffer object, render the scene, detach the renderbuffer, bind the texture, and render with the new texture.
So we have these two textures, and we're going back and forth between them. This is setting up the parameters for texture one, a 512 by 512 texture image. We create the framebuffer using similar methods as for other objects.
When you create the framebuffer object it's empty, and then we have to add our resources: a renderbuffer to render into, or a texture, which can also be rendered into. If we want to do hidden surface removal, we have to add a depth buffer attachment to the framebuffer object.
So here's a description of rendering to texture. For rendering to texture we need different programs, which means different shaders, like we've been discussing here. And if we're swapping buffers, we're switching between them, and we might need some other resources as well.
This is the shader that gives us the red triangle, and here are the shaders that do the texture mapping. We set up the buffer with the points array data, make the connection to the shader variable, and then we render the triangle at 64 by 64. Then, for the second render, we're using the system framebuffer, so we detach from the framebuffer we'd set up.
We turn off the vertex attribute array, and we're using program two, which deals with the texture. Here we have a texture being activated, and we're connecting texture one as our texture. For buffer two, we load the data from the vertices through flatten, then we make the connection to the position in the shader, and then we have the texture coordinates. So does that make sense?
So this is the example of diffusion. There are two types of "agents", quote unquote; they're not interacting with the environment, and they're indicated by the different colors. They move randomly. We have the position information specified by rendering to the texture, and then we're diffusing by going back and forth and diffusing the image, so we get a sense of the passage of time. So initially we draw the points. Let's see.
So does that make sense? That one is drawing the positions of the points, and the points are being updated each time. So we do a rendering of the points, and then we diffuse them: we render the points to the texture, and we diffuse the texture again. This is the diffusing pair, vertex shader one and fragment shader one, and this one is drawing the points.
It sets the color based on the uniform variable. So we're using program one and binding to a framebuffer: if the flag is true, then we bind texture one and attach texture two to the framebuffer, so we render into texture two; otherwise we bind texture two and render into texture one. We're drawing a strip of triangles. Then, to render the points, we use program two and set up the attributes.
The interesting line here, or actually it's four lines: here we're setting the uniform color for program two. The first one is red and blue, so that's kind of magenta, and the other one is almost fully green. So we're drawing the vertices as points: the first half of the vertices are drawn in magenta and the others are drawn in green.
Then we're rendering to the display, and then we swap the textures: if we used texture two before, now we use texture one, and if we used texture one before, now we use texture two.
So here they're reading the texture: if a green particle lands on a green pixel, then it's going to move, and if a magenta particle sees a red component, then it's going to move as well, to a different spot.
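That collision rule, as I read it, can be restated as a small decision function. This is my own hedged sketch, not the lecture code: the function and field names are mine, and it assumes RGBA components in [0, 1] read back from the texture:

```javascript
// Each particle samples the texture at its own position and decides
// whether to move: a green particle moves if green paint is already
// there, and a magenta particle moves if it sees any red component.
function shouldMove(kind, pixel) {
  const [r, g, b] = pixel;   // RGBA sample, alpha unused here
  if (kind === "green") return g > 0;
  if (kind === "magenta") return r > 0;
  return false;
}
```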
So we can see some difference in that. And here's the picking that we talked about last day.
This conditional makes more sense to me than the one in the code. If color zero is 255, so if red is set: then if color one is 255, it's yellow; otherwise, if there's no green but there's blue, it's magenta; otherwise it's red. If color one is set instead: if green is set and blue is also set, it's cyan; otherwise it's green. Otherwise, if the blue is set, then it's colored blue. And if we're not hitting any of them, then the color is the background.
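The nesting just described translates directly into code. This is my restatement of that conditional, not the program's source: each object is drawn in a unique flat color, the pixel under the mouse is read back with channels in 0 to 255, and the channels are decoded to find what was picked.

```javascript
// Decode a picked pixel [r, g, b] (0-255 per channel) to an object name,
// following the nesting described above.
function decodePick(color) {
  if (color[0] === 255) {                  // red channel set
    if (color[1] === 255) return "yellow"; // red + green
    if (color[2] === 255) return "magenta";// red + blue, no green
    return "red";
  }
  if (color[1] === 255) {                  // green set, no red
    if (color[2] === 255) return "cyan";   // green + blue
    return "green";
  }
  if (color[2] === 255) return "blue";     // only blue set
  return "background";                     // nothing hit
}
```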
And I'll post some modified versions of the code to the website. Any questions or concerns?

Have a good weekend.

Take care.

Take care.
Responses
What important concept or perspective did you encounter today?
- Today, I learned about the use of frame buffer objects in WebGL and how they can be used to help smooth out a ragged image.
- Important concepts from today: Monday we will go over topics from chapter 9 and talk about advanced rendering methods; Wednesday we will review, ask questions, and look at interesting demos. We looked at the render2, render3, and particleDiffusion code, then went over slides about the framebuffer and about using multiple vertex and fragment shaders for program1 and program2 in the same code files.
- Assignment 3 discussion and some image formation on the objects and also detail about multi pass rendering
- We discussed the framebuffer and looked at some example of code such as particle diffusion.
- went through rendering and color or particle diffusions code
- mapping
- Frame buffers and how to use them. program with 2 shaders and how these 2 shaders work together ; double buffering
- The concept of the framebuffer: In Render 3, Program 1 renders triangles into the framebuffer, and then Program 2 uses this output to apply a diffusion effect and the concept of how the framebuffer works in the particle diffusion program.
- learn about particle diffusion and check 'render2', 'diffuser' and 'particledissusion' webgl program
Was there anything today that was difficult to understand?
Was there anything today about which you would like to know more?