Outline for Today
Wiki
Link to the UR Courses wiki page for this meeting
Media
Transcript
Audio Transcript
-
Happy Wednesday.
-
So I had... Sorry, do you have a question?
-
I was confused because previous assignments had two separate sample code files, and this time it's just one, so I wanted to make sure it's just one and that I'm not making a mistake.
-
No, I was just thinking about making it easier. Thank you.
-
Sara, you said earlier that there might be a chance you could give us a day or two on assignment number three by pushing back the due date, because we have a project and assignments coming in this week as well. So, would you consider that?
-
Yes. You mean in your mobile computing class? Do you have a project and an assignment? Yes, a project.
-
The presentation is not today, but actually during the final exam period, the project presentations. Okay.
-
Monday?
-
Monday.
-
The 27th? Right, I think that's the right day. Monday is the 27th.
-
Yes, okay. Thank you.
-
Okay, does that sound like a
plan?
-
Morning
-
Day
-
Let me pull the screen down.
-
So, one of the examples from the text is the... I don't remember what he called it. I don't think it's "cube alpha". Pardon me? I said it's a deconstructed cube. Yes. So the issue is
-
so we talked about translucent
surfaces the other day.
-
So the idea was to draw opaque
polygons with a depth buffer. So
-
we're going to enable depth
testing. We're going to write
-
the depth buffer for opaque
polygons. And then we're going
-
to use the depth buffer
information, but not write to
-
it, read-only, to render the translucent polygons, and we're going to draw them in order from furthest to nearest.
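For reference, a minimal sketch of that two-pass ordering in WebGL (the draw helpers here are hypothetical placeholders):

    // Pass 1: opaque polygons, depth test on, depth writes on, no blending
    gl.enable(gl.DEPTH_TEST);
    gl.depthMask(true);
    gl.disable(gl.BLEND);
    drawOpaquePolygons();                        // hypothetical helper

    // Pass 2: translucent polygons, depth buffer read-only, blending on
    gl.depthMask(false);                         // keep testing, stop writing
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
    drawTranslucentPolygonsFurthestToNearest();  // hypothetical helper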
-
So I deconstructed the cube, as Karen says, so that you can easily order the polygons from back to front.
-
And then every time I'm rendering, I move them around a bit so you can see some different effects.
-
So now the yellow and the cyan
are opaque.
-
So what does that look like in code?
-
So I'm passing in... I have a vertex attribute for the
-
position and for the color and I
have a model view and a
-
projection matrix. So I'm
determining position by
-
multiplying the input vertex
position by the modelview matrix
-
and the projection matrix
-
and then the fragment shader is just passing through the vertex color, setting it as the fragment color.
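For reference, a minimal sketch of that shader pair, assuming WebGL 2 / GLSL ES 3.00 and made-up attribute and uniform names:

    const vertexShaderSource = `#version 300 es
    in vec4 aPosition;
    in vec4 aColor;
    uniform mat4 uModelView;
    uniform mat4 uProjection;
    out vec4 vColor;
    void main() {
      gl_Position = uProjection * uModelView * aPosition;  // transform the vertex
      vColor = aColor;                                      // pass the color through
    }`;

    const fragmentShaderSource = `#version 300 es
    precision mediump float;
    in vec4 vColor;
    out vec4 fColor;
    void main() {
      fColor = vColor;   // fragment color is the interpolated vertex color
    }`;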
-
So here are the coordinates of the faces. They go from minus 0.5, minus 0.5 to plus 0.5, plus 0.5, so they're squares around the origin, and then I'm setting the z value to be different for each. So I'm starting out with this one, the furthest, and then 6, 5, 4, 3, 2, 1,
-
and here I'm setting the face colors, so you can see that blue and green have alpha 0.5, and red and magenta have alpha 0.5.
-
And then I'm setting the projection: near is 0 and far is 10,
-
left and right, bottom and top. And then I'm specifying the viewing parameters, so the eye point is at minus one in z, looking at the origin, with the positive y axis up.
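For reference, a sketch of that setup using the textbook's MV.js helpers; the left/right/bottom/top values here are assumptions:

    var projection = ortho(-2.0, 2.0, -2.0, 2.0, 0.0, 10.0);  // left, right, bottom, top, near 0, far 10

    var eye = vec3(0.0, 0.0, -1.0);   // eye at minus one in z
    var at  = vec3(0.0, 0.0, 0.0);    // looking at the origin
    var up  = vec3(0.0, 1.0, 0.0);    // positive y axis is up
    var view = lookAt(eye, at, up);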
-
So here we're generating the
data. Setting the viewport
-
enabling depth test
-
Here we have the color buffer and the vertex buffer, and then we're setting up the locations for the projection and modelview matrices and making those matrices. So when we render, we clear the color and the depth, set the projection matrix, and then we turn on depth mask true, which means we're writing to the depth buffer,
-
and we disable blending
-
I'm not sure that's strictly necessary, but better safe than sorry. I think it's good practice to do that.
-
So then, here I have the indices
of the opaque polygons. So I'm
-
going through the list and then
I'm creating the modelview
-
matrix. I start by applying a random rotation, then a random translation, and then I multiply that by the lookAt viewing transformation, and that gives the modelview transformation.
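For reference, that composition in MV.js terms might look like the sketch below; the angle, axis, offsets, and variable names (view, modelViewLoc) are placeholders:

    // model transform: random rotation first, then random translation
    var model = mult(translate(tx, ty, tz), rotate(theta, vec3(0.0, 1.0, 0.0)));
    // modelview = viewing transform applied to the model transform
    var modelView = mult(view, model);
    gl.uniformMatrix4fv(modelViewLoc, false, flatten(modelView));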
-
and so I'm drawing each face separately, and this is giving the offset into the coordinate array and drawing six vertices, so two triangles, in other words a square, with each call to drawArrays. And then when we finish that, we go and draw the translucent spheres... squares, pardon me. So we turn off writing to the depth buffer. We're still using it; we're just not adding information to it.
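For reference, the per-face draw for the opaque pass might be a loop like this sketch (the face list name is assumed):

    for (var i = 0; i < opaqueFaces.length; i++) {
      // ... set up this face's modelview matrix here ...
      // offset into the vertex array, then 6 vertices = 2 triangles = 1 square
      gl.drawArrays(gl.TRIANGLES, 6 * opaqueFaces[i], 6);
    }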
-
Enable blend. And here's a blend
function
-
so the source is the fragment
shader and the destination is
-
the frame buffer. So that's what
we're blending
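In WebGL terms, that blend setup is typically:

    gl.enable(gl.BLEND);
    // source = the incoming fragment, destination = what is already in the framebuffer
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);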
-
so we're doing the same kind of
setup where we're doing a random
-
rotation, then a random
translation. Then multiplying by
-
the lookAt viewing transformation.
-
So we're drawing
-
each square separately, from the back to the front.
-
So then I'm waiting five
seconds, and then we're starting
-
again. And here's just the code
to put the vertex data together
-
for the triangles in the squares.
-
Does that make sense? So the thing that caused me a lot of grief this time
-
Yeah.
-
was the text's use of an undefined value
-
because if we don't
-
I'm not gonna mess with it.
-
So that didn't come up as an error. At runtime, it did? No, really, not even at runtime.
-
So with the depth mask, the text used one undefined value, and by extension I also wrote gl.TRUE, so having those two undefined values messed it up. And that was the problem. So I have to report that to the textbook authors.
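If I'm reading this right, the pitfall looks something like the following: gl.TRUE and gl.FALSE are OpenGL constants that don't exist on a WebGL context, so they evaluate to undefined.

    gl.depthMask(gl.TRUE);   // gl.TRUE is undefined; this silently turns depth writes OFF
    gl.depthMask(true);      // the intended call uses a JavaScript boolean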
-
Anyway, so I'll have this version published, and I'll give you a link to it.
-
Okay, so the other one I wanted
to show you is.
-
So what we're doing here is
creating a texture in an off
-
screen buffer and we have a
lower resolution version here.
-
So we can adjust that. In this example the text was incomplete, because it talked about setting up a render buffer but it didn't use it. So I've used it here. So
-
this is like the way we do the assignment of buffers in our earlier examples, where we create a buffer and bind it for vertices and colors. If we want to do that for an off-screen framebuffer object, we create a renderbuffer and then we attach the renderbuffer to the framebuffer. And what we need here is something to store depth, because, it's hard to tell here, but there are two triangles, one poking through the other one.
-
Okay so let's look at
-
So vertex shader one just takes the position as passed, setting the position variable. Vertex shader two is taking the position, but we're also using texture coordinates. So fragment shader one
-
I've changed it from the original to accept a uniform color, and we're setting that to be the fragment color. So that's it, that's fragment shader one. Fragment shader two assigns the fragment color based on accessing the texture map. So, vertex shader one and fragment shader one are creating the texture; vertex shader two and fragment shader two are using the texture.
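For reference, a sketch of the second fragment shader (GLSL ES 3.00; the variable names are assumed):

    const fragmentShader2Source = `#version 300 es
    precision mediump float;
    in vec2 vTexCoord;
    uniform sampler2D uTextureMap;
    out vec4 fColor;
    void main() {
      fColor = texture(uTextureMap, vTexCoord);  // look the color up in the texture
    }`;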
-
okay so
-
So we're only using one texture unit, so we set that to be the active texture and create an empty texture, storing RGBA, so RGB and alpha, in unsigned bytes, and we're defining it to be texSize by texSize.
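For reference, a sketch of that empty-texture setup (texSize is whatever resolution we choose):

    var texture = gl.createTexture();
    gl.activeTexture(gl.TEXTURE0);      // only one texture unit in use
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, texSize, texSize, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);   // RGBA, unsigned bytes, no data yet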
-
Now we're setting some parameters for the texture. Then we allocate a framebuffer object, so this is an off-screen place to render the texture. So we bind the framebuffer, because after we've created it we're going to use it, and then we attach the color buffer, and for the color we're using our texture, so we're going to first write into that texture and then we're going to use it later. Then this is the way to get the depth buffer set up: we create a renderbuffer, and we bind the renderbuffer, the depth buffer we just created,
-
and then we're setting up the storage for it, so we're making it the same size as the texture,
-
and here we're attaching it. So earlier we attached the texture to the framebuffer for the color; here we're attaching the renderbuffer to the framebuffer. So we're attaching to the framebuffer a depth attachment, saying this is going to be used as the depth buffer.
-
So because we're doing the depth buffer, we're setting up the format of the data we're storing, so we're setting up the storage to be for depths, and this constant is the way to do that.
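Putting those steps together, a sketch of the off-screen setup (variable names assumed):

    // Framebuffer object with the texture as its color attachment
    var framebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                            gl.TEXTURE_2D, texture, 0);

    // Renderbuffer to hold depth, the same size as the texture
    var renderbuffer = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, renderbuffer);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, texSize, texSize);
    gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                               gl.RENDERBUFFER, renderbuffer);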
-
So I've used more descriptive names here: create triangle texture and use triangle texture. So we're using the first one to create the texture. Enable depth test.
-
So here we're setting the color for the first triangle to be reddish, and then the second triangle we're setting up to be greenish.
-
So we're generating the texture, and then we're switching from the framebuffer object to the screen. We set the texture and then we render, and we're drawing the background square with the texture.
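Switching back to the default framebuffer (the on-screen canvas) is a call like:

    gl.bindFramebuffer(gl.FRAMEBUFFER, null);          // null means render to the canvas again
    gl.viewport(0, 0, canvas.width, canvas.height);    // restore the full-canvas viewport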
-
See if we can generate a higher
resolution
-
hello
-
So we can generate a very high resolution texture. So when we sample it with the second fragment shader, we have an excess of information here, so the result is a very, very sharp image.
-
let's go back to a lower
resolution version.
-
Does that make sense?
-
So we need to specify
-
we have to describe everything, right?
-
that means
-
geometry, camera.
-
Staring at stuff... so if we're staring at arrays of numbers, that is not conducive to...
-
Smelt up front
-
all right laserLine
-
that's not a word
-
maybe there's not two genes
anyway
-
so
-
Now, looking at... they have a teapot model in the chapter nine code, and there are some errors in it.
-
Staring at the numbers is not conducive.
-
in brackets as
-
spelling
-
Let me just check my spelling here. It is great. Oh, okay.
-
she's added that word there
-
looks worse
-
okay
-
So we want to think about the strategy for modeling something. If we want to include a cube, we don't necessarily want to deal with just the cube vertices and make that up ourselves; it's better to have a cube object, for example. And then, so, I want to go through the...
-
Okay, so we've defined two
different cubes and we've
-
positioned them and are doing
the rotation
-
Okay, so let's look at how
that's defined
-
So we're doing the rotation in the vertex shader.
-
So it's just a matter of
rotation
-
So we've left the transformations to position the cubes in the JavaScript code. So let's look at that now.
-
So I started adding a third
cube. So we'll try to complete
-
that in a minute
-
so we have
-
the color buffer, vertex buffer, event listeners, and then further on we're just drawing all the triangles here.
-
right so let's look at how the
cube is defined.
-
So we're setting up the vertices based on the size. If it's not specified, we set the size to be 0.5; otherwise, if a size is specified, we set it to be half the size. So the idea is that if we don't specify a size we get a unit cube centered around the origin, and if we do specify a size then we're specifying the size of the face, so we divide by two and we get a cube of that size around the origin,
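For reference, a sketch of that size handling (the function and parameter names are assumptions):

    function cube(size) {
      // No size given: unit cube, so half-width 0.5.
      // Size given: it is the face (edge) length, so halve it to stay centered at the origin.
      var s = (size === undefined) ? 0.5 : size / 2;
      // ... build vertices at +/- s, plus normals, indices, and colors ...
    }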
-
and we get the face normals and indices. So this is how we're specifying how to use the vertices and creating the face vertex colors. These are all opaque.
-
So this is how we're doing it. We're specifying the coordinates of the vertices, and you can see how they correspond here: the first face of the cube is 1, 0, 3, 2, and the triangles are 1, 0, 3 and 3, 2, 1. So that's the way we're specifying the triangles for the cube faces.
-
Here are the texture
coordinates.
-
So what we're doing here is
applying the modeling
-
transformations directly to the cube vertices, so
-
so that's why we don't have to.
-
Again, in my alpha example earlier, we did the modeling transformations separately from the squares that are being drawn. Here, if we used this approach, we could specify the square and the coordinates would be changed to reflect the modeling transformations we've applied.
-
Okay, so that's how we're getting the data set up for the cube this way.
-
so let's add our third
-
cube
-
Aside from your lab's two cubes, let's see if we can...
-
See if I can
-
Too far away
-
okay
-
So let's see an advantage of that kind of approach.
-
So looking at the cube data, we've separated functionality from our main JavaScript program. You can see it's only 104 lines long, so it's pretty straightforward; we've managed to hide some complexity, so maybe that's an advantage.
-
So if I want to do 1000 cubes with this approach, I'm getting 1000 copies of the cube data.js data structure, so that's not efficient.
-
So we're also cementing primitives in place in space. Right, we're initializing the coordinates there, so if you want to change the data or the transformations in the model, to animate it, that might be a problem.
-
So there is... let me show you the teapot instance. So here's the problem that I can see.
-
No, it's not so bad.
-
So as I did some reading, it was the teapot, and originally it didn't have a bottom, so the bottom data has been added.
-
so the nice thing about
-
so to be more efficient
-
I'm not looking at the right file.
-
So instead of copying the data and having that inefficiency, we can draw instanced: we can use drawArraysInstanced. So we have one set of data from which we can get multiple versions of the same model. That helps us to manage complexity and makes the code more readable as well.
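For reference, the instanced call in WebGL 2 is a single draw over one copy of the data (the counts here are placeholders):

    // numVertices vertices drawn numInstances times; the vertex shader positions each
    // copy using gl_InstanceID or a per-instance attribute
    gl.drawArraysInstanced(gl.TRIANGLES, 0, numVertices, numInstances);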
-
Okay, so
-
we've talked a little bit about modeling here. So we'll look at how to deal with the robot arm and the figure HTML file next day. And then that's Monday, then Wednesday and Monday the next week. We need to look at that example, and Karen requested some review material on the course.
-
so we have just three more
sessions
-
Yeah, time flies when you're having fun. Anyway, any questions or concerns?
-
I'll see if I can find that problem with the data, okay? So thank you very much for today. Have a good weekend. And so we have the due date for assignment three on Monday at one minute to midnight. Okay, and good luck with your final slash project in your other class.
-
Okay, take care everyone.
-
My name
-
is be waiting for somebody to
come up with
Responses
What important concept or perspective did you encounter today?
- The most important thing that I learned in today's class was when we started chapter 9, which is all about modelling. In modelling we actually need to tell everything about the scene: we need to define materials, geometry, lights, cameras, etc. in order to render something that we want on the screen.
- Today the assignment was discussed in the class. The deadline was pushed to Monday and code was demonstrated.
Was there anything today that was difficult to understand?
- It was difficult to understand how to render geometry and lights for multiple canvases at the same time.
- The whole modelling topic was difficult to understand, as I was late to the class.