Mtg 18/23: Mon-18-Nov-2024

Outline for Today

Texture Mapping

Administration

Today

For Next Meeting

Wiki

Link to the UR Courses wiki page for this meeting

Media

Transcript

Audio Transcript

  • Good job.
  • How is everyone today?
  • Anyone watch the Grey Cup? Anyone pleased with the results?
  • Okay. Here we are at meeting 18.
  • So I anonymized the comments about the midterm and pushed them through UR Courses.
  • So there was some disagreement about whether the midterm was at the right level of difficulty. You can look at that; I've removed the names associated with the comments.
  • So there's a question of what we can do to make the final exam a fair assessment of your learning over the semester, so that it matches what we've done in class, is challenging but not too challenging, and there is sufficient time to answer the questions.
  • I'd really like you to engage in the idea of taking part in the creation of an exam that is a fair assessment of your work this semester. When I say please consider suggesting questions and answers starting now, that doesn't mean you should open UR Courses during class and start typing away, but if that works for you, I wouldn't be too upset.
  • So I will work out the example and give you a few different ways to approach it.
  • In that exam question, we wanted to move the center of rotation to a point not at the origin.
  • In terms of code, we have a matrix, and then we're going to multiply. We specify a translation to the new center, then we apply our rotation matrix. Does that make sense?
  • Is that a good way to get at the process? This happens in a few cases where we want to do a rotation or other transformation about some point other than the origin: we specify the translation from the origin to the new center, apply the transformation, then translate the center back to the origin.
  • So the first one specified is the last one applied.
  • So we're doing the translation back to the origin. Is that still okay?
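A minimal sketch of this composition in JavaScript, assuming MV.js-style helpers (mat4, mult, translate, rotate, vec3) like the ones used with the course examples; the fixed point and angle are made-up values for illustration:

    // Rotate about a fixed point p instead of the origin.
    // Specification order: translate to p, rotate, translate back.
    // Because each new matrix is multiplied on the right, the first one
    // specified (translate to p) is the last one applied to the vertices.
    const p = vec3(1.0, 2.0, 0.0);               // hypothetical center of rotation
    const theta = 45.0;                          // degrees

    let m = mat4();                              // identity
    m = mult(m, translate(p[0], p[1], p[2]));    // T(p)    -- applied last
    m = mult(m, rotate(theta, vec3(0.0, 0.0, 1.0))); // R(theta) about z
    m = mult(m, translate(-p[0], -p[1], -p[2])); // T(-p)   -- applied first
    // m now rotates vertices by theta about the z-axis through p.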
  • So I neglected to set a cutoff date for the assignments: a date after which you wouldn't just get a late penalty, you wouldn't get a mark at all.
  • We have our final exam on the 9th, so if you're struggling to get assignments in on the due date, they have to be in by the 10th.
  • So there was a comment about clarifying what goes on with Gouraud shading and Phong shading.
  • Let's take a minute to talk about where the calculations are done for each of them.
  • Okay, not everyone all at once. The vertex shader? Can you repeat that? Right, for Gouraud it's done in the vertex shader.
  • Do we use normal vectors in Gouraud shading? Yes, we do, but we use them only at the vertices, so we're computing an intensity.
  • We're computing the intensity at the vertex, when we're doing the lighting equation.
  • So we perform the lighting calculations at the vertices, and that gives us a vertex color. We pass the vertex color to the fragment shader, which outputs the fragment color, and the interpolation of that value gives us our image.
  • So we calculate the intensity at the vertex, and then the intensity is interpolated across the scan line on the way to the fragment shader.
  • So Phong shading is really in the fragment shader.
  • With Gouraud shading, using the vertex shader, we're interpolating the intensity that we compute for the vertices. With Phong shading and the fragment shader, does it interpolate intensities?
  • Yes? No?
  • Okay, so with Phong shading, we're performing the lighting calculations at the fragments.
  • With Phong shading, we're interpolating normal vectors, and because we're doing the lighting calculation at the fragment level, we can make a more accurate approximation in our illumination equation, so we get more realistic highlights, specular highlights.
  • Anything else? Does that make sense?
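A minimal sketch of the two approaches, written as GLSL ES 3.00 source strings in JavaScript. It assumes a single point light and Blinn-Phong-style specular, and the uniform names (lightPosition, ambientProduct, diffuseProduct, specularProduct, shininess) are placeholders, not necessarily the ones in the course code; the point is only where the lighting equation is evaluated.

    // Gouraud shading: the lighting equation is evaluated once per vertex in the
    // vertex shader; the rasterizer interpolates the resulting color, and the
    // fragment shader just outputs the interpolated value.
    const gouraudVertexShader = `#version 300 es
    in vec4 aPosition;
    in vec3 aNormal;
    uniform mat4 modelViewMatrix, projectionMatrix;
    uniform vec4 lightPosition, ambientProduct, diffuseProduct, specularProduct;
    uniform float shininess;
    out vec4 vColor;
    void main() {
        vec3 pos = (modelViewMatrix * aPosition).xyz;
        vec3 N = normalize(mat3(modelViewMatrix) * aNormal); // fine if no non-uniform scale
        vec3 L = normalize(lightPosition.xyz - pos);
        vec3 V = normalize(-pos);
        vec3 H = normalize(L + V);
        vec4 diffuse = max(dot(L, N), 0.0) * diffuseProduct;
        vec4 specular = pow(max(dot(N, H), 0.0), shininess) * specularProduct;
        vColor = ambientProduct + diffuse + specular;   // intensity at the vertex
        gl_Position = projectionMatrix * modelViewMatrix * aPosition;
    }`;

    // Phong shading: the normal and eye-space position are interpolated, and the
    // lighting equation is evaluated per fragment in the fragment shader.
    const phongFragmentShader = `#version 300 es
    precision mediump float;
    in vec3 vNormal;       // interpolated normal from the vertex shader
    in vec3 vPosition;     // interpolated eye-space position
    uniform vec4 lightPosition, ambientProduct, diffuseProduct, specularProduct;
    uniform float shininess;
    out vec4 fColor;
    void main() {
        vec3 N = normalize(vNormal);
        vec3 L = normalize(lightPosition.xyz - vPosition);
        vec3 V = normalize(-vPosition);
        vec3 H = normalize(L + V);
        vec4 diffuse = max(dot(L, N), 0.0) * diffuseProduct;
        vec4 specular = pow(max(dot(N, H), 0.0), shininess) * specularProduct;
        fColor = ambientProduct + diffuse + specular;   // intensity per fragment
    }`;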
  • So there's a question or comment about translucency. That comes up in Chapter 8, and we'll do alpha blending then, and the shaders for it. I found a couple of old examples, but I haven't had a chance to put them into WebGL, into example code.
  • Oh, I would like to just mention something about the midterm.
  • So, to talk a bit about texture mapping: we can map a texture from an image, and it doesn't have to be a photograph; it can be a constructed image that we create in our code.
  • Environment mapping is another way to add cues about the space we're in, and bump mapping is about modifying normal vectors to add detail that isn't in the model. So we're cheating a bit; we're getting extra detail for a very good price.
  • These slides are from 2020, and the figures might be older than that.
  • Anyone know how many millions of polygons or triangles per second a current graphics card can handle?
  • It's a lot, but we don't want to spend all our polygons on complicated, maybe physically accurate, models of clouds and rain and skin. That last one doesn't seem to belong in the list, but in a game those models aren't the focus of our attention; they add realism and believability to the settings.
  • So we do texture mapping to allow us to keep the polygon count low and still have realistic textures, believable detail.
  • Consider modeling an orange. A sphere that's colored orange may be appropriate in some cases, but it's too simple.
  • If we scan an orange, a navel orange or a mandarin, a Christmas orange, it's not spherical. You don't see fruit at the grocery store that's a perfect sphere.
  • But we don't want to make the model, the geometry, too complex in order to capture all the features of the orange. Maybe there's some happy medium between getting the basic shape and then adding detail with mapping.
  • So we can take a picture of an orange and then map it onto the sphere. We can also do bump mapping: putting the picture of an orange onto the orange sphere is texture mapping, and changing the surface characteristics, the normals, is bump mapping.
  • So we have a geometric model on the left and then a texture-mapped version on the right.
  • Then here's an example with environment mapping. The surface of the model is reflective, and it's reflecting the environment that we specified for it.
  • And here is a version with bump mapping, so we can see the differences in the surface quality. Here it looks a bit rough; here it's smoother. Impressive or not?
  • So we're doing mapping at the end of the pipeline, after we've clipped out the polygons that aren't going to be in the scene.
  • We want to go from the texture to the model. It sounds straightforward, but we have a challenge to get from our 2D texture to the 3D surface.
  • So there are different coordinate systems: parametric coordinates, which specify positions on our models; texture coordinates, used to identify points in the image to be mapped; object or world coordinates, where the mapping takes place; and window coordinates, where the final image is produced.
  • So the idea is that the parametric coordinates are used to describe the object surface that's going to have the texture applied to it. Then we have texture coordinates s and t, and we're going to map the texture coordinates to the parametric coordinates on the surface.
  • So then, as we find the points on the surface we're looking to map to, we make the connection with the texture coordinates and sample the texture at that point, and we get our final image.
  • We can think of x, y and z in terms of the texture coordinates, but it might be easier to do it the other way: we have a pixel, and we want to know which point on the object it corresponds to, so we specify s and t in terms of x, y and z.
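A minimal sketch in JavaScript of this backward mapping for a unit sphere centered at the origin, computing (s, t) from a surface point (x, y, z) with spherical coordinates; this is one common convention, not necessarily the one on the slides:

    // Map a point on the unit sphere to texture coordinates (s, t) in [0, 1].
    // s follows the longitude (angle around the y-axis), t the latitude.
    function sphereTexCoord(x, y, z) {
        const s = 0.5 + Math.atan2(z, x) / (2.0 * Math.PI); // longitude -> s
        const t = 0.5 - Math.asin(y) / Math.PI;             // latitude  -> t
        return [s, t];
    }

    // Example: the equator point (1, 0, 0) maps to the middle of the image.
    console.log(sphereTexCoord(1, 0, 0)); // [0.5, 0.5]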
  • It might be easier to map to an intermediate object, say a cylinder, and then go from the cylinder to the surface we're interested in texture mapping.
  • If we think about a sphere, we have an issue: putting a square image onto the sphere is not going to be without problems.
  • If you were giving a basketball or a volleyball or a water polo ball as a Christmas gift and you wanted to wrap it, it might look good around the middle, but at the ends it's a problem.
  • So instead of a sphere, we can use a box as the intermediate object. Then it's a matter of putting square images on the different sides of the box.
  • When we go from our intermediate object to our final one, we need to take care of the normals.
  • So here's an example of accessing the texture and getting some aliasing problems, because we're not sampling with high enough frequency to catch the blue stripes.
  • Instead of point sampling, we can average over an area, but that's slower.
  • If we think about where those point samples are taken: when we average over the area, we're not just taking one sample for the area that's represented in the image and on the surface. If we look at all the values available there, we would certainly catch some blue stripe here and here; this one is pretty well centered, so it might not get much blue at all.
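In WebGL this choice shows up in the texture filtering parameters. A minimal sketch, assuming a texture object is already bound to gl.TEXTURE_2D:

    // Point sampling: take the single nearest texel (fast, prone to aliasing).
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

    // Averaging: interpolate between neighbouring texels, and use mipmaps
    // so minified textures are averaged over an appropriately sized area.
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);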
  • So in WebGL, the steps to apply a texture are: specify the texture (you can generate an image or read one from a source file), then assign the texture, make it available, and enable texturing.
  • We also need to assign texture coordinates to the vertices and specify texture parameters, such as how we're going to wrap or filter it.
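A minimal sketch of those steps in JavaScript, assuming a compiled and linked program with an aTexCoord attribute and a uTexMap sampler uniform, an image already loaded into img, a texCoords array, and an MV.js-style flatten(); all of these names are placeholders, not necessarily the ones in the course code:

    // 1. Specify the texture: create a texture object and load the image into it.
    const texture = gl.createTexture();
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);

    // 2. Texture parameters: wrapping and filtering.
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST_MIPMAP_LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

    // 3. Make it available to the shaders: texture unit 0 feeds the sampler uniform.
    gl.uniform1i(gl.getUniformLocation(program, "uTexMap"), 0);

    // 4. Assign texture coordinates to the vertices via a buffer and attribute.
    const tBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, tBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(texCoords), gl.STATIC_DRAW);
    const aTexCoord = gl.getAttribLocation(program, "aTexCoord");
    gl.vertexAttribPointer(aTexCoord, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(aTexCoord);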
  • So the idea is we go from the image at the bottom left to the geometry at the top left, and then the display at the top right.
  • So here we're taking an image and then mapping it onto a surface that's viewed in perspective.
  • I think that's pretty straightforward. The idea is that we're mapping the image onto the surface, so as the surface changes, the viewing parameters change, and the perspective projection is applied, we get to see the image as if it's pasted onto that surface.
  • So we have a geometry pipeline for the vertices and a parallel pipeline for the images, the textures, and they both lead into the fragment processor.
  • So we can create a texture image in memory. The textbook has an example with a checkerboard image.
  • So here's an example where we're taking a square texture and mapping it onto a triangle. Then it's a matter of specifying texture coordinates for the triangle vertices.
  • We can see here that A is mapped to (0.2, 0.8), B is mapped to (0.4, 0.2), and C to (0.8, 0.4). So we can manipulate the texture mapping that way.
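A minimal sketch of assigning those per-vertex texture coordinates in JavaScript, assuming an MV.js-style vec2; the position values are invented for illustration, and the texture coordinates are as read from the slide above:

    // One (s, t) pair per triangle vertex; the rasterizer interpolates these
    // across the triangle when the texture is sampled.
    const positions = [
        vec2(-0.5, -0.5),   // A (hypothetical positions)
        vec2( 0.5, -0.5),   // B
        vec2( 0.0,  0.5)    // C
    ];
    const texCoords = [
        vec2(0.2, 0.8),     // A
        vec2(0.4, 0.2),     // B
        vec2(0.8, 0.4)      // C
    ];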
  • So this checkerboard pattern is a texture that's applied.
  • We're doing the rotation in our vertex shader, and we're getting the color, the texture coordinates and the position as vertex attributes.
  • Then we're specifying gl_Position, that's setting the position, and we don't go further with that. We set vColor to aColor, so the vertex color is the attribute color that we've read from our buffer, and vTexCoord is assigned the value from aTexCoord, the texture coordinate attribute that we specified per vertex.
  • In the fragment shader we have vColor and vTexCoord coming in, the fragment color going out, and a uniform sampler for the texture map.
  • So the fragment color is the vertex color multiplied by the texture value returned from the sampler.
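A minimal sketch of that shader pair, written as GLSL ES 3.00 source strings in JavaScript. The attribute, varying and uniform names follow the ones mentioned above, and the rotation uniform is an assumption about how the example applies its rotation:

    // Vertex shader: apply the rotation, pass color and texture coordinates through.
    const vertexShaderSource = `#version 300 es
    in vec4 aPosition;
    in vec4 aColor;
    in vec2 aTexCoord;
    uniform mat4 uRotation;        // assumed rotation matrix uniform
    out vec4 vColor;
    out vec2 vTexCoord;
    void main() {
        vColor = aColor;           // attribute color read from the buffer
        vTexCoord = aTexCoord;     // per-vertex texture coordinate
        gl_Position = uRotation * aPosition;
    }`;

    // Fragment shader: modulate the interpolated color by the texture sample.
    const fragmentShaderSource = `#version 300 es
    precision mediump float;
    in vec4 vColor;
    in vec2 vTexCoord;
    uniform sampler2D uTexMap;     // the texture map
    out vec4 fColor;
    void main() {
        fColor = vColor * texture(uTexMap, vTexCoord);
    }`;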
  • That sounded like an alarm, but it was very short-lived.
  • So we're going to create the image in memory, alternating black and white. Here we're configuring the texture, and we're pushing texture coordinates with the quad.
  • So the texture coordinates, s and t, go from zero to one, and the order depends on how they're being pushed to our data structure, the buffer.
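A minimal sketch of building such a checkerboard texture in memory, in JavaScript; the size and block width are illustrative choices, not necessarily what the course's texturecube2 example uses:

    // Build a texSize x texSize RGBA checkerboard, alternating black and white
    // every 8 x 8 block of texels.
    const texSize = 64;
    const image = new Uint8Array(4 * texSize * texSize);
    for (let i = 0; i < texSize; i++) {
        for (let j = 0; j < texSize; j++) {
            const c = ((Math.floor(i / 8) + Math.floor(j / 8)) % 2) ? 255 : 0;
            const k = 4 * (i * texSize + j);
            image[k] = image[k + 1] = image[k + 2] = c; // R, G, B
            image[k + 3] = 255;                         // opaque alpha
        }
    }

    // Load it as the currently bound 2D texture.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, texSize, texSize, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, image);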
  • You can see what happens if you change the coordinates or the mapping.
  • How does that sound? Does that make sense? Any questions or concerns? Applause?
  • Okay, I'll have some examples to discuss on Wednesday.
  • Thank you very much for today, and a good rest of your day.
  • Take care, everyone.

Responses

What important concept or perspective did you encounter today?

  • In the last meeting, I learned the concepts of Gouraud and Phong shading. For Gouraud, the calculations are done in the vertex shader, with lighting and intensity computed at the vertices. For Phong, the calculations are done in the fragment shader and deal with lighting and the normal vectors.
  • Texture Mapping techniques
  • Some important concepts covered in the meeting today were: student feedback about the midterm; the assignment 3 and 4 cutoff date of Dec. 10; Gouraud and Phong shading; texture mapping and its coordinate systems; the three steps to applying a texture; and then we looked at the code for texturecube2.
  • Texture mapping and converting to on screen coordinates
  • We discussed how textures get distorted when they are mapped to geometry. This reminds me of how 2D maps of the globe are distorted when we try to map the surface of the globe to a rectangle. And in a similar way, map makers must choose how they want to distort the map. Usually it occurs at the poles.
  • Getting to discuss texture mapping is what I believe to be one of the most fundamental and important concepts to discuss and learn how to properly utilize. We also discussed how bump/normal maps are interpreted and utilized to imitate complex geometry.
  • Shading and different types of mapping, Texture, environment and bump and also about midterm and assignment
  • The difference between Gouraud and Phong shading in relation to where the computation is done, with regards to the fragment and vertex shaders
  • We discussed Gouraud and Phong shading, and texture mapping.
  • Texture mapping
  • Gouraud v/s Phong Shading, Texture mapping, bump mapping and how they are achieved; forward mapping and backward mapping; mapping images onto objects in depth
  • The concept of Gouraud shading computes lighting in the vertex shader and interpolates vertex intensities across the polygon. Phong shading, calculated in the fragment shader, interpolates normals across the polygon for per-pixel lighting. Texture mapping assigns a 2D texture to a 3D surface and samples it in the fragment shader for rendering.
  • We asked some questions about the assignment and got answers.

Was there anything today that was difficult to understand?

Was there anything today about which you would like to know more?