# Computer Laboratory

Computer Graphics and Image Processing

The Computer Graphics and Image Processing course webpage contains links to the syllabus, past tripos questions and the lecture notes.

## First Supervision (from course website and the course book)

1. Read the second chapter of the course book (Fundamentals of Computer Graphics)
2. Calculate the ultimate monitor resolution (i.e. colour pixels/inch) beyond which better resolution will be unnecessary.
3. What are the ray parameters of the intersection between the ray (1,1,1) + t(-1,-1,-1) and the sphere centered at the origin with radius 1?
4. The moon is poorly approximated by diffuse or Phong shading. What observations tell you that this is true?
5. Why do most highlights on plastic objects look white, while those on gold metal look gold?
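Question 3 above can be sanity-checked numerically: substitute the ray p(t) = o + t·d into the implicit sphere equation |p|² = r² and solve the resulting quadratic in t. The following plain-JavaScript sketch (illustrative only, not part of the course materials) does exactly that:

```javascript
// Ray-sphere intersection: substitute p(t) = o + t*d into |p|^2 = r^2,
// giving the quadratic a*t^2 + b*t + c = 0 with the coefficients below.
function raySphere(o, d, r) {
  const dot = (u, v) => u[0]*v[0] + u[1]*v[1] + u[2]*v[2];
  const a = dot(d, d);
  const b = 2 * dot(o, d);
  const c = dot(o, o) - r * r;
  const disc = b * b - 4 * a * c;
  if (disc < 0) return [];                    // ray misses the sphere
  const s = Math.sqrt(disc);
  return [(-b - s) / (2 * a), (-b + s) / (2 * a)];
}

// Ray (1,1,1) + t(-1,-1,-1) against the unit sphere at the origin:
// a = 3, b = -6, c = 2, so t = 1 -/+ 1/sqrt(3) (about 0.4226 and 1.5774).
const ts = raySphere([1, 1, 1], [-1, -1, -1], 1);
```

The smaller root is the visible (near) intersection; both roots are positive because the ray starts outside the sphere and points toward it.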

## First Supervision (practical work)

1. Raytrace multiple spheres (and optionally some cubes and cylinders). More details on the exercise can be found in the Introduction and simple rendering section of the course website. You do not need to send me the code for the exercise, only a couple of screenshots.

## Second Supervision (from course website and book)

1. Read chapters five and six of the course book (Fundamentals of Computer Graphics)
2. Give as many reasons as possible why we use matrices to represent transformations. Explain why we use homogeneous co-ordinates.
3. Coordinate Systems. Draw pictures to show what is meant by:
• object coordinates
• world coordinates
• viewing coordinates
• screen coordinates
4. Derive the conditions necessary for two Bézier curves to join with:
• just C0-continuity
• C1-continuity;
• C2-continuity.
What would be difficult about getting three Bézier curves to join in sequence with C2-continuity at the two joins?
5. Compare and contrast:
• texture mapping
• bump mapping
• displacement mapping
6. Rotations:
• Show how to perform 2D rotation around an arbitrary point.
• Show how to perform 3D rotation around an arbitrary axis parallel to the x-axis.
• Show how to perform 3D rotation around an arbitrary axis (in abstract no need to derive maths).
7. Describe a complete algorithm to do 3D polygon scan conversion, including details of clipping, projection, and the underlying 2D polygon scan conversion algorithm.
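For the first part of question 6, the standard construction is to translate the point to the origin, rotate, and translate back — a composition that only works as a single matrix because we use homogeneous coordinates (which is also one answer to question 2). A minimal plain-JavaScript sketch with row-major 3×3 homogeneous matrices (illustrative only, not from the notes):

```javascript
// Rotation about an arbitrary point p = (px, py):
// M = T(p) * R(theta) * T(-p), using 3x3 homogeneous matrices.
function matMul(A, B) {
  const C = [[0,0,0],[0,0,0],[0,0,0]];
  for (let i = 0; i < 3; i++)
    for (let j = 0; j < 3; j++)
      for (let k = 0; k < 3; k++)
        C[i][j] += A[i][k] * B[k][j];
  return C;
}
const translate = (tx, ty) => [[1, 0, tx], [0, 1, ty], [0, 0, 1]];
const rotate = (th) => [[Math.cos(th), -Math.sin(th), 0],
                        [Math.sin(th),  Math.cos(th), 0],
                        [0, 0, 1]];

function rotateAbout(px, py, th) {
  return matMul(translate(px, py), matMul(rotate(th), translate(-px, -py)));
}

// Apply M to the homogeneous point (x, y, 1).
function apply(M, x, y) {
  return [M[0][0]*x + M[0][1]*y + M[0][2],
          M[1][0]*x + M[1][1]*y + M[1][2]];
}

// Rotating (2,1) by 90 degrees about (1,1) gives (1,2).
const [rx, ry] = apply(rotateAbout(1, 1, Math.PI / 2), 2, 1);
```

The same translate-transform-translate pattern generalises to the second part of question 6: translate the axis onto the x-axis, rotate about the x-axis, and translate back.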

## Third Supervision (from course website and book)

1. Read chapters 10 and 12 from the course book
2. We use a lot of triangles to approximate surfaces in computer graphics. Why are they good? Why are they bad? Can you think of any alternatives?
3. BSP Tree. Break down the following (2D!) lines into a BSP-tree, splitting them if necessary:
• (0,0)-(2,2)
• (3,4)-(1,3)
• (1,0)-(-3,1)
• (0,3)-(3,3)
• (2,0)-(2,1)
4. We often use triangles to represent a sphere. Describe two methods of generating triangles from a sphere.
5. 3D Clipping.
• Compare the two methods of doing 3D clipping in terms of efficiency.
• How would using bounding volumes improve the efficiency of these methods?
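For question 4, one common answer is the latitude-longitude ("UV") sphere; the other is recursive subdivision of an icosahedron (or another platonic solid), projecting new vertices back onto the sphere. The first method can be sketched in plain JavaScript (illustrative only, not from the notes):

```javascript
// Latitude-longitude ("UV") sphere: sample the parametric sphere on a
// grid of stacks x slices and join neighbouring samples into triangles.
function uvSphereTriangles(r, stacks, slices) {
  const pt = (i, j) => {                   // i: stack index, j: slice index
    const theta = Math.PI * i / stacks;    // polar angle, 0..pi
    const phi = 2 * Math.PI * j / slices;  // azimuth, 0..2pi
    return [r * Math.sin(theta) * Math.cos(phi),
            r * Math.sin(theta) * Math.sin(phi),
            r * Math.cos(theta)];
  };
  const tris = [];
  for (let i = 0; i < stacks; i++) {
    for (let j = 0; j < slices; j++) {
      const a = pt(i, j),     b = pt(i + 1, j),
            c = pt(i + 1, j + 1), d = pt(i, j + 1);
      // Each grid quad becomes two triangles, except at the poles,
      // where one of the two collapses to a degenerate sliver.
      if (i < stacks - 1) tris.push([a, b, c]);  // collapses at the south pole
      if (i > 0)          tris.push([a, c, d]);  // collapses at the north pole
    }
  }
  return tris;
}
// e.g. 8 stacks x 16 slices -> 2 * (8 - 1) * 16 = 224 triangles
```

A drawback of this method, relevant to the question, is that triangles bunch up near the poles; icosahedral subdivision gives much more uniformly sized triangles.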

## Third Supervision (practical work)

1. Setup For this supervision you will create a scene using OpenGL. For this exercise we will use WebGL and THREE.js (so you will need to do a minimal bit of JavaScript coding). Modern browsers are WebGL-compatible, and some libraries exist which make things easier for you.

THREE.js abstracts away some of the headaches of WebGL, while still giving you an idea of how the pipeline works. Some tutorials are available here. For debugging, look at the JavaScript console of your browser. It will also help if you use an editor such as Notepad++ or Sublime Text, rather than Notepad, for editing the text file.

You will find all you need in a .zip file (courtesy of Erroll Wood) that contains:
• wegbl_basic.html – this file contains the GLSL code for the vertex shader, fragment shader, and Javascript code for setting up the scene. Modify the code in this file.
• three.min.js – this library abstracts away lots of the headaches of WebGL, while still giving you an idea of how the pipeline works.
• OrbitControls.js – this library provides mouse controls for rotating the camera.
• The file you should modify is wegbl.html.
2. Basic geometry and manipulation
• Create a denser sphere. Look at the documentation here.
• Add two more spheres and position them at (-5, 0, 0) and (5, 0, -5).
3. Shading This is still looking a bit boring; let's spice things up by adding some shading, editing the fragment and vertex shaders (you might find this helpful).
• Edit the fragment shader to colour the sphere using its normal. This is not realistic, but it is a good way to debug things. To convert from vec3 to vec4, use vec4(object, 1.0); the fragment shader expects colour to be specified as a vec4 (r, g, b, a).
• You will observe that the normal changes as the sphere rotates. Create a version of the normal that is in the viewport; you might find the viewMatrix useful (to convert from vec4 to vec3, use vec3(vec)). Do the same for the light location (this will come in handy later).
• Now we will start to shade the objects properly. Start by setting the object to an ambient colour of vec4(0.2, 0.2, 0.2, 1.0).
• Add some diffuse shading (for example blue, (0.0, 0.0, 1.0, 1.0)). The functions you might find helpful for this are normalize(vec), dot(vec1, vec2), and max(val1, val2).
• Add specular lighting; you will find the functions reflect(l, n) and pow(val, deg) useful.
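The shading steps above add up to the classic ambient + diffuse + specular (Phong) model. The sketch below mirrors the shader arithmetic in plain JavaScript so you can check your numbers; in the exercise itself this lives in GLSL, the shininess exponent of 32 is an assumed value, and no clamping to [0,1] is done here:

```javascript
// Phong-style shading, mirroring the GLSL arithmetic.
// n: surface normal, l: direction to the light, v: direction to the viewer.
const dot3 = (a, b) => a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
const scale = (a, s) => a.map(x => x * s);
const normalize = (a) => scale(a, 1 / Math.sqrt(dot3(a, a)));

function shade(n, l, v, diffuseColour) {
  n = normalize(n); l = normalize(l); v = normalize(v);
  const ambient = [0.2, 0.2, 0.2];
  const kd = Math.max(dot3(n, l), 0);         // GLSL: max(dot(n, l), 0.0)
  // GLSL's reflect(I, N) reflects the *incident* vector, so pass -l:
  const inc = scale(l, -1);
  const r = [inc[0] - 2 * dot3(n, inc) * n[0],
             inc[1] - 2 * dot3(n, inc) * n[1],
             inc[2] - 2 * dot3(n, inc) * n[2]];
  const ks = Math.pow(Math.max(dot3(r, v), 0), 32);  // assumed shininess of 32
  return ambient.map((a, ch) => a + kd * diffuseColour[ch] + ks);
}

// Light and viewer straight above a surface facing up: diffuse is at
// full strength and the specular highlight is at its peak.
const shaded = shade([0, 0, 1], [0, 0, 1], [0, 0, 1], [0.0, 0.0, 1.0]);
```

Note how the highlight term ignores the diffuse colour entirely, which is exactly why plastic highlights look white (question 5 of the first supervision).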
4. Texturing So far the spheres were coloured using a solid colour; let's add some texture.
• First create a texture by adding the following two lines to the code:

```
THREE.ImageUtils.crossOrigin = '';
var earth_texture = THREE.ImageUtils.loadTexture("http://i.imgur.com/K5Figdih.jpg");
```
• We want to access this texture; to do this, add the following uniform variable to the shaderMaterial:

```
earth_texture: { type: "t", value: earth_texture }
```

Finally, add this uniform variable to the fragment shader:

`uniform sampler2D earth_texture;`
• You will also need to index into the uv coordinates of the texture; to do so, add this line to both of the shaders:

`varying vec2 vUv;`

and the following line to the main method of the vertex shader:

`vUv = uv;`

Now we're finally ready for the actual colours. To access the colour in the texture file, use (in the fragment shader):

`vec3 t_col = texture2D(earth_texture, vUv).rgb;`
• Replace the ambient, specular and diffuse object colours with the texture colours.
5. Surface detail
• Now we can use the same logic to add a normal map to better simulate reflectance properties. You can find a bump/normal map texture corresponding to the previous texture here: "http://i.imgur.com/Xrgn2FUh.jpg". This stores the xyz directions of the normal in the rgb values of the image. Be careful: this stores the normals for a planar surface, so you need to adapt them for the sphere.

Hint: the directions of the normals are encoded to lie in the range 0 to 1, as that is how images work; you will need to perform appropriate corrections to recover the actual directions. You might want to look at actual pixel values to see how this information is encoded.
• Finally, let's add a specular map, found here: "http://i.imgur.com/XVFZ1jKh.jpg". 0 means no specular reflection; 1 is full reflection. You will find the length method useful for determining the vector length from the texture.
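The hint above amounts to the usual normal-map decode: a channel value c in [0,1] encodes a component n = 2c - 1, followed by renormalisation. A plain-JavaScript sketch (illustrative; in the exercise this correction is done in the fragment shader):

```javascript
// Decode a normal-map texel: rgb in [0,1] encodes xyz in [-1,1],
// so map each channel with n = 2*c - 1 and renormalize.
function decodeNormal(rgb) {
  const n = rgb.map(c => 2 * c - 1);
  const len = Math.hypot(n[0], n[1], n[2]);
  return n.map(x => x / len);
}

// The "flat" texel (0.5, 0.5, 1.0) decodes to the straight-up normal (0, 0, 1),
// which is why unperturbed normal maps look uniformly light blue.
const n = decodeNormal([0.5, 0.5, 1.0]);
```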
6. Optional extensions:
• Add a displacement map (http://i.imgur.com/BnNLKFKh.jpg) which actually extrudes the texture of the earth, as opposed to just changing the reflection properties (this will have to be done in the vertex shader). Extrusion should be done in the direction of the normal.
• An example of the camera circling is already there; add the light circling as well. This can simulate the sun.
• Add some other spheres with different textures and reflectance properties to simulate a small solar system. This will require some restructuring of your code.
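The displacement-map extension boils down to moving each vertex along its normal by the height sampled from the map. A plain-JavaScript mirror of the vertex-shader arithmetic (the scale factor s is an assumed tuning parameter, not something specified by the exercise):

```javascript
// Displacement along the normal, as done in the vertex shader:
// p' = p + h * s * n, where h is the height sampled from the map
// and s scales the effect to taste.
function displace(p, n, h, s) {
  return [p[0] + h * s * n[0],
          p[1] + h * s * n[1],
          p[2] + h * s * n[2]];
}

// A point on the unit sphere pushed outward by height 0.5 at scale 0.2
// ends up at radius 1.1.
const displaced = displace([1, 0, 0], [1, 0, 0], 0.5, 0.2);
```

Since the shader only moves vertices and not their normals, the lighting will still use the original sphere normals; combining this with the normal map from the previous step hides that discrepancy.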

## Fourth Supervision (from course website and book)

1. Read chapters 21 and 22 from the course book
2. Compare and contrast the use of LCDs and electrophoretic displays for screens in portable devices.
3. Colour Spaces. Explain the use of each of the following colour spaces:
• RGB
• XYZ
• HLS
• Luv
4. Explain the difference between additive colour (RGB) and subtractive colour (CMY). Where is each used and why is it used there?
5. Compare the two methods of Error Diffusion described in the notes, with the aid of a sample image.
6. Select one of the following (make sure your selection is different from your supervision partner's):
• electrophoretic display
• DMD display
• LCD display
Find out how it works and write a 500-word description that a 12-year-old could understand.
© 2015 Computer Laboratory, University of Cambridge
Information provided by Tadas Baltrusaitis