
CS 351 - Assignment 9: Two More Things

Elena Kirillova and Titobiloluwa Awe


In this project, we implemented A-buffering, a substitute for z-buffering that allows transparent shapes to be rendered, and z-buffer texture mapping, which let us map an image onto the surface of a polygon and model surfaces at a finer level of detail.

Description of Task


To implement A-buffering, we needed a way to attach a list of polygons to each pixel in the image, add polygons to that list during our scanline fill, and then compute the color at each pixel with a recursive color function.

Z-buffer Texture Mapping

Z-buffer texture-mapping required us to make several modifications to the current library: a new structure to store the texture object, a fields in other structure to hold texture properties. Also we had to modify our scanline fill so that for every edge of a polygon texture like everything else was updated in right way in depth, in dx and dy.



A-Buffering

We created a Plink data structure with color, depth, opacity, and next fields, where next points to the next polygon in the list. We added this structure to both the Polygon and the Image data structures so that every pixel in the image, and every polygon, had its own unique plink pointer. Then, in the fillscan function, we inserted the polygon's plink into the list at the current image point, which populated the linked list. Once that was done, we called a function named cascade_polygons that takes in an image, loops over it, and computes the color at each point using a recursive coloring function.
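The compositing itself is a short recursive pass over each pixel's list. Below is a minimal sketch of the Plink node and the recursive color function; the fields match the description above, but the helper name plink_color and the exact types are illustrative rather than our exact code.

```c
/* Sketch of the A-buffer fragment node described above. The list at each
 * pixel is assumed to be kept sorted front to back by depth. */
typedef struct Plink {
    float color[3];     /* RGB color of this polygon fragment */
    float depth;        /* depth value used to keep the list sorted */
    float opacity;      /* 1.0 means fully opaque */
    struct Plink *next; /* next fragment behind this one */
} Plink;

/* Recursively composite the list front to back: each layer contributes
 * opacity * its own color plus (1 - opacity) * the color of whatever lies
 * behind it. An empty list falls back to a black background. */
static void plink_color(Plink *p, float out[3]) {
    if (p == NULL) {
        out[0] = out[1] = out[2] = 0.0f;
        return;
    }
    float behind[3];
    plink_color(p->next, behind);
    for (int i = 0; i < 3; i++)
        out[i] = p->opacity * p->color[i] + (1.0f - p->opacity) * behind[i];
}
```

cascade_polygons then simply visits every pixel and writes the result of this function back into the image.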

Z-buffer Texture Mapping

Making texture mapping work took a number of steps, with changes to the polygon, module, drawstate, and scanline files.

First, we created functions to handle the texture itself. Texture_create reads an image and stores its data in a field of the texture structure. Texture_value takes the boundaries of an area on the texture and averages the colors of all the pixels within it into one color.
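As a rough, self-contained sketch of what Texture_value does (with the library's image and color types replaced by plain arrays, and the field names assumed):

```c
/* Assumed texture layout: Texture_create fills rows, cols, and the rgb
 * array (rows * cols * 3 floats) from the image it reads. */
typedef struct {
    int rows, cols;
    float *rgb;
} Texture;

/* Average every texel inside the box [s0,s1] x [t0,t1] (texture
 * coordinates in [0,1]) into one output color. */
void Texture_value(Texture *tex, float s0, float s1,
                   float t0, float t1, float out[3]) {
    int c0 = (int)(s0 * (tex->cols - 1)), c1 = (int)(s1 * (tex->cols - 1));
    int r0 = (int)(t0 * (tex->rows - 1)), r1 = (int)(t1 * (tex->rows - 1));
    float sum[3] = {0.0f, 0.0f, 0.0f};
    int count = 0;

    for (int r = r0; r <= r1; r++)
        for (int c = c0; c <= c1; c++) {
            float *p = &tex->rgb[3 * (r * tex->cols + c)];
            sum[0] += p[0];
            sum[1] += p[1];
            sum[2] += p[2];
            count++;
        }
    if (count > 0)
        for (int i = 0; i < 3; i++)
            out[i] = sum[i] / count;
}
```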

In the same way that we set normals and colors, we created a new function to set texture coordinates on a polygon, and then updated the rest of the polygon functions to handle the new field correctly. We also made the texture a new object type in our modeling system, so that the drawstate carries a pointer to the texture by the time we reach the scanline fill.
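The setter parallels the existing normal and color setters; a sketch of what it might look like, with the field name st and the trimmed-down struct layout assumed:

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    float s, t;     /* texture coordinates for one vertex */
} TexCoord;

/* Trimmed-down polygon: only the fields relevant to texturing are shown.
 * st is assumed to be initialized to NULL when the polygon is created. */
typedef struct {
    int nVertex;
    TexCoord *st;   /* one (s, t) pair per vertex, NULL if untextured */
} Polygon;

/* Copy one texture coordinate pair per vertex into the polygon. */
void Polygon_setTexture(Polygon *p, TexCoord *st) {
    if (p == NULL || st == NULL)
        return;
    free(p->st);    /* drop any previous coordinates */
    p->st = malloc(sizeof(TexCoord) * p->nVertex);
    memcpy(p->st, st, sizeof(TexCoord) * p->nVertex);
}
```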

In the scanline fill, we gave every edge fields such as curS, curT, dsPerScan, and dtPerScan to track how the texture coordinates change from row to row.
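A trimmed-down sketch of those per-edge fields and the per-row update; the real edge record also carries the x and depth increments, which are left out here:

```c
/* Only the texture-related edge fields are shown. */
typedef struct {
    float curS, curT;            /* texture coordinates at the current row */
    float dsPerScan, dtPerScan;  /* change in s and t per scanline */
} Edge;

/* Step an edge down by one scanline, exactly like x and depth are
 * stepped in the rest of the fill. */
static void edge_step(Edge *e) {
    e->curS += e->dsPerScan;
    e->curT += e->dtPerScan;
}
```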

The final step brought everything together: using the edges' texture fields, we calculated the bounding box of each image pixel in texture coordinates, in other words, the area of texture pixels that falls within one pixel of the image. We then called Texture_value to average those texels into the single color that the pixel is set to.
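For one span of a row, that step might look like the sketch below, which reuses the Texture_value sketch from above; the function name and parameters are illustrative rather than our exact code:

```c
/* Fill one horizontal span: walk from the left edge's texture coordinates
 * to the right edge's, treat each pixel's step as a bounding box in
 * texture space, and let Texture_value average the texels inside it. */
void fill_textured_span(Texture *tex,
                        float sLeft, float tLeft,
                        float sRight, float tRight,
                        int xStart, int xEnd, float out[][3]) {
    int width = xEnd - xStart;
    if (width <= 0)
        return;
    float sPerColumn = (sRight - sLeft) / width;
    float tPerColumn = (tRight - tLeft) / width;
    float s = sLeft, t = tLeft;

    for (int x = xStart; x < xEnd; x++) {
        /* bounding box of this image pixel in texture coordinates */
        float s1 = s + sPerColumn, t1 = t + tPerColumn;
        Texture_value(tex,
                      s < s1 ? s : s1, s < s1 ? s1 : s,
                      t < t1 ? t : t1, t < t1 ? t1 : t,
                      out[x - xStart]);
        s = s1;
        t = t1;
    }
}
```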


A-buffer test

An image with transparent blue, green, and red squares and an opaque cube:

Transparent Cubes

Here the cubes are all transparent, with an opacity of 0.5:

Texture Mapping test

A texture mapped onto three polygons placed differently in the scene.

Planetary with Mapped Center

For this image, we modified our Cube function to take a texture as a parameter.

As one can see, the center cube has a Mandelbrot set texture mapped onto all of its sides.


This was a very interesting project, and although implementing it took a lot of debugging and print statements, it was worth it in the end.