WebGL: Intro
This page includes all sections for Step 3 (../step3.php) in a single page.
Section: Coloring and Texturing a Square
Download and unzip the project WebGLIntro.zip (WebGLIntro.zip) into some local directory so you can work on it. It uses some common packages under the “Common” folder and a couple of texture images under the “Material” folder.
The folders 0-8 are there for your convenience; folders 1-8 contain the final .html and .js files after stages 1-8 of the tutorial, respectively.
Work directly on the .html and .js files in the root folder (parent folder of 0-8).
When you open Step3.html in a browser, you'll see a green square. The html file contains simple vertex and fragment shaders. The .js contains the initialization of a square (4 vertices and 2 triangles), code that uploads them to buffers, and a rendering function, where gl.drawElements() does the job of rendering all triangle elements in parallel. The code is fairly straightforward, as it doesn't do much.
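As a rough sketch of that structure (the vertex coordinates here are illustrative, and the names follow the conventions used throughout this tutorial; the files you downloaded are the reference):

    var points = [vec3(-0.5, -0.5, 0.), vec3(-0.5, 0.5, 0.),
                  vec3(0.5, 0.5, 0.), vec3(0.5, -0.5, 0.)];   // 4 vertices
    var indices = [[0, 1, 2], [0, 2, 3]];                     // 2 triangles
    // ... buffers are created and filled in init() ...
    function render() {
        gl.clear(gl.COLOR_BUFFER_BIT);
        // One draw call; the GPU shades all triangle elements in parallel.
        gl.drawElements(gl.TRIANGLES, nFaces * 3, gl.UNSIGNED_SHORT, 0);
    }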
We can use the GPU to do the interpolation for us and change it into a smoothly colored square. To do that, we first need to add a "varying" variable to BOTH the vertex and fragment shaders, anywhere in front of main(). (Recall that they are both in the html file, as mentioned above.) It will serve as an output of the vertex shader and as an input of the fragment shader.

    varying vec2 fUV;

Next, we initialize it in the vertex shader.

    fUV = vPosition.xy;

Finally, we use it for the color in the fragment shader, by replacing the existing line with

    gl_FragColor = vec4(fUV, 0., 1.);

Make sure that it works. Then, let's turn it into a textured square.
The process is essentially identical to what we did before: First, create a texture by adding the following to init(), somewhere after rectangle()

    initTexture();

Define the following boilerplate functions.

    var squareTexture;
    function initTexture() {
        squareTexture = gl.createTexture();
        var squareImage = new Image();
        squareImage.onload = function () {
            handleTextureLoaded(squareImage, squareTexture);
        };
        squareImage.src = "Material/HelloWorld.png";
    }
    function handleTextureLoaded(image, texture) {
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
        gl.generateMipmap(gl.TEXTURE_2D);
        gl.bindTexture(gl.TEXTURE_2D, null);
    }

Next, we use the texture in the fragment shader by introducing a uniform sampler variable, somewhere before main(). It tells us which texture in GPU memory we are using.

    uniform sampler2D uSampler;

And change the line for gl_FragColor into

    gl_FragColor = texture2D(uSampler, fUV);

Finally, we need to bind squareTexture to TEXTURE0 and link uSampler to the 0th texture in render(), before calling gl.drawElements().

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, squareTexture);
    gl.uniform1i(gl.getUniformLocation(program, "uSampler"), 0);

If you open Step3.html as a local file, your browser may show a black square, since some browsers consider the image to be of cross-domain origin. You may use Firefox, or set up a local web server for the WebGL folder. You can also put your WebGL folder under your CSE account folder "web" to access it on your personal website. Here is how, if you don't know it already. (http://www.cse.msu.edu/Facility/Howto/CreateWebpage.php) BTW, the debugging tools can be activated with Ctrl+Shift+I on Chrome/Firefox, and with F12 on IE.
You can try other texture images at this point. (Since handleTextureLoaded() generates mipmaps, WebGL requires the image dimensions to be powers of two.) The .html and .js should look like the ones under folder "1".
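If you'd rather not configure a full web server, any static file server works for local testing; for example (assuming Python 3 is installed), running python3 -m http.server from the WebGL folder and browsing to http://localhost:8000/Step3.html avoids the cross-origin restriction.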
Section: Rotations and Projection
Now we rotate the square in 3D. We first add three buttons to the html, by adding the following code right before the closing </body> tag (the ids must match the event handlers we will add below; the labels are up to you):

    <button id="xButton">Rotate X</button>
    <button id="yButton">Rotate Y</button>
    <button id="zButton">Rotate Z</button>

Then we declare a uniform variable before main() in the vertex shader, which stores three Euler angles.

    uniform vec3 theta;

Next, we change main() of the vertex shader to handle the rotations represented as three Euler angles stored in theta

    // Compute the sines and cosines of theta for each of
    // the three axes in one computation.
    vec3 angles = radians( theta );
    vec3 c = cos( angles );
    vec3 s = sin( angles );
    // Remember: these matrices are column-major
    mat4 rx = mat4( 1.0, 0.0, 0.0, 0.0,
                    0.0, c.x, s.x, 0.0,
                    0.0, -s.x, c.x, 0.0,
                    0.0, 0.0, 0.0, 1.0 );
    mat4 ry = mat4( c.y, 0.0, -s.y, 0.0,
                    0.0, 1.0, 0.0, 0.0,
                    s.y, 0.0, c.y, 0.0,
                    0.0, 0.0, 0.0, 1.0 );
    mat4 rz = mat4( c.z, s.z, 0.0, 0.0,
                    -s.z, c.z, 0.0, 0.0,
                    0.0, 0.0, 1.0, 0.0,
                    0.0, 0.0, 0.0, 1.0 );
    gl_Position = rz * ry * rx * vPosition;
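GLSL matrix constructors take their arguments in column-major order: the first four numbers fill the first column, the next four the second column, and so on. Written out in the usual row form, rx above is

    [ 1    0      0    0 ]
    [ 0   c.x  -s.x    0 ]
    [ 0   s.x   c.x    0 ]
    [ 0    0      0    1 ]

which is the standard rotation about the x-axis; ry and rz are the analogous rotations about y and z.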
Now we switch to the .js file, and add variables and code to update the angles and handle the button events. First, we insert the following variables before the init() function

    var axis = 0;
    var xAxis = 0;
    var yAxis = 1;
    var zAxis = 2;
    var theta = [0, 0, 0];

Then, we insert the code that gets the location of theta in the shader (already declared as a uniform input variable) and the event handlers in init(), before it calls render().

    //event listeners for buttons
    document.getElementById("xButton").onclick = function () {
        axis = xAxis;
    };
    document.getElementById("yButton").onclick = function () {
        axis = yAxis;
    };
    document.getElementById("zButton").onclick = function () {
        axis = zAxis;
    };

Finally, we update the angle based on the current choice inside render(), before clearing the buffer.

    theta[axis] += 2.0;
    gl.uniform3fv(gl.getUniformLocation(program, "theta"), theta);
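Calling gl.getUniformLocation() every frame works, but the location never changes after the program is linked, so it can be looked up once instead. A minimal sketch (thetaLoc is a name introduced here for illustration):

    var thetaLoc;                                         // global, next to theta
    thetaLoc = gl.getUniformLocation(program, "theta");   // once, in init()
    gl.uniform3fv(thetaLoc, theta);                       // every frame, in render()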
Now you can try it in a browser again: click the buttons to change the rotation axis.
We can translate the camera and do a proper perspective projection by changing the code that computes gl_Position to the following.
    float near = 0.1;
    float far = 100.;
    mat4 mProjection = mat4(1., 0., 0., 0.,
                            0., 1., 0., 0.,
                            0., 0., (near + far) / (near - far), -1.,
                            0., 0., 2. * near * far / (near - far), 0.);
    mat4 translation = mat4(1., 0., 0., 0.,
                            0., 1., 0., 0.,
                            0., 0., 1., 0.,
                            0., 0., -3., 1.);
    gl_Position = mProjection * translation * rz * ry * rx * vPosition;
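Because of the same column-major convention, mProjection in row form is

    [ 1   0    0                        0                     ]
    [ 0   1    0                        0                     ]
    [ 0   0   (near+far)/(near-far)     2*near*far/(near-far) ]
    [ 0   0   -1                        0                     ]

the standard perspective matrix (here with a 90-degree field of view and unit aspect ratio): after the perspective divide it maps z = -near to depth -1 and z = -far to depth +1. The translation matrix places the square at z = -3, between the near and far planes.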
Reload the html, and the square will move further away. The code is now the same as the files in folder "2".
Section: From Square to Torus
Now we change the square to a rectangle and wrap the rectangle into a torus, while keeping the texture coordinates as the original 2D coordinates.
We start by making the sampling density higher by adding more points and triangles: replace the code in rectangle() before nFaces = indices.length by
    function rectangle() {
        var r = 0.2;
        var R = 1.;
        var xRes = 40;
        var yRes = 40;
        for (i = 0; i <= xRes; i++) {
            for (j = 0; j <= yRes; j++) {
                phi = Math.PI * (i * 2. / xRes - 1.);
                psi = Math.PI * (j * 2. / yRes - 1.);
                points.push(vec3(phi * R, psi * r, 0));
            }
        }
        for (i = 0; i < xRes; i++) {
            for (j = 0; j < yRes; j++) {
                indices.push([i * (yRes + 1) + j,
                              (i + 1) * (yRes + 1) + j,
                              (i + 1) * (yRes + 1) + j + 1]);
                indices.push([i * (yRes + 1) + j,
                              (i + 1) * (yRes + 1) + j + 1,
                              i * (yRes + 1) + j + 1]);
            }
        }
        nFaces = indices.length;

We can see that the positions of the vertices are no longer good texture coordinates. So we introduce a separate global variable to store the UVs.

    var UVs = [];

We make sure that we have 10 copies of the texture along the horizontal direction and 2 copies along the vertical direction, by adding the texture coordinates after points.push().

    UVs.push(phi / Math.PI * 5., psi / Math.PI);

To use them, we first add an input to the vertex shader

    attribute vec2 vUV;

Now we change the code for setting fUV in the vertex shader to

    fUV = vUV;

To link UVs to the vertex shader, we add the following at the end of rectangle()

    //Create buffer to store the texture coordinates
    var tcBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, tcBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(UVs), gl.STATIC_DRAW);
    //Link data to vertex shader input
    var vUV = gl.getAttribLocation(program, "vUV");
    gl.vertexAttribPointer(vUV, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vUV);

Try it, and you should see 20 copies of the texture. Now we are ready to wrap it into a cylinder. Replace the line for points.push() with

    points.push(vec3(phi * R, Math.cos(psi) * r, Math.sin(psi) * r));

Now the cylinder sometimes shows the side that should have been occluded. This is because we still need to turn on the depth test in render(): replace gl.clear() with

    gl.enable(gl.DEPTH_TEST);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

This time, it should look normal, and we are ready to turn it into a torus by changing points.push() again:

    points.push(vec3(Math.cos(phi) * (R + Math.cos(psi) * r),
                     Math.sin(phi) * (R + Math.cos(psi) * r),
                     Math.sin(psi) * r));

At this point, your code should be similar to the files in subfolder "3".
Section: Simple Lighting
In WebGL, lighting calculations are handled in the shaders. Before we actually perform the calculation, we need one more piece of information: the normal. We can add a global variable.

    var normals = [];

Then we push normals where we push points and UVs:

    normals.push(Math.cos(phi) * Math.cos(psi), Math.sin(phi) * Math.cos(psi), Math.sin(psi));

We also create buffer data similar to that of the texture coordinates:

    //Create buffer to store the normals
    var nBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, nBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(normals), gl.STATIC_DRAW);
    //Link data to vertex shader input
    var vNormal = gl.getAttribLocation(program, "vNormal");
    gl.vertexAttribPointer(vNormal, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vNormal);

Now we need to access it in the vertex shader as an attribute:

    attribute vec3 vNormal;

To interpolate it, we add a varying variable to both shaders:

    varying vec3 fNormal;

We set fNormal in the vertex shader

    fNormal = (rz * ry * rx * vec4(vNormal, 0.)).xyz;

Finally, we use fNormal to calculate shading in the fragment shader

    vec3 lightDirection = normalize(vec3(1., 2., 0.5));
    float shading = max(0., dot(fNormal, lightDirection));
    gl_FragColor = vec4(shading * 0.7, shading * 1., shading * 0.7, 1.)
        * texture2D(uSampler, fUV) + vec4(0.2, 0.2, 0.2, 0.);

Now the code should be like those in folder "4".
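One refinement worth knowing about (optional here, and not part of the files in folder "4"): linear interpolation does not preserve length, so the interpolated fNormal is in general slightly shorter than a unit vector. Renormalizing in the fragment shader before the dot product corrects this:

    vec3 n = normalize(fNormal);
    float shading = max(0., dot(n, lightDirection));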
Section: Adding a Cube
To render more than one object, we need more than one call to gl.drawElements(), and we need to prepare the data. First, we can add a cube() function, and call it in init() before the call to rectangle().

    var points2 = [];
    var normals2 = [];
    var UVs2 = [];
    var indices2 = [];
    var buffers2 = [];

    function cube() {
        points2.push(
            vec3(-0.5, -0.5, 0.5),
            vec3(-0.5, 0.5, 0.5),
            vec3(0.5, 0.5, 0.5),
            vec3(0.5, -0.5, 0.5),
            vec3(-0.5, -0.5, -0.5),
            vec3(-0.5, 0.5, -0.5),
            vec3(0.5, 0.5, -0.5),
            vec3(0.5, -0.5, -0.5));
        //Inaccurate normals just for testing.
        normals2.push(
            normalize(vec3(-0.5, -0.5, 0.5)),
            normalize(vec3(-0.5, 0.5, 0.5)),
            normalize(vec3(0.5, 0.5, 0.5)),
            normalize(vec3(0.5, -0.5, 0.5)),
            normalize(vec3(-0.5, -0.5, -0.5)),
            normalize(vec3(-0.5, 0.5, -0.5)),
            normalize(vec3(0.5, 0.5, -0.5)),
            normalize(vec3(0.5, -0.5, -0.5)));
        UVs2.push(
            vec2(0., 0.), vec2(1., 0.), vec2(1., 1.), vec2(0., 1.),
            vec2(1., 1.), vec2(0., 1.), vec2(0., 0.), vec2(1., 0.));
        indices2.push(
            1, 0, 3, 1, 3, 2,
            2, 3, 7, 2, 7, 6,
            3, 0, 4, 3, 4, 7,
            6, 5, 1, 6, 1, 2,
            4, 5, 6, 4, 6, 7,
            5, 4, 0, 5, 0, 1);
        var vBuffer2 = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, vBuffer2);
        gl.bufferData(gl.ARRAY_BUFFER, flatten(points2), gl.STATIC_DRAW);
        buffers2.push(vBuffer2);
        var nBuffer2 = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, nBuffer2);
        gl.bufferData(gl.ARRAY_BUFFER, flatten(normals2), gl.STATIC_DRAW);
        buffers2.push(nBuffer2);
        var tcBuffer2 = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, tcBuffer2);
        gl.bufferData(gl.ARRAY_BUFFER, flatten(UVs2), gl.STATIC_DRAW);
        buffers2.push(tcBuffer2);
        var tBuffer2 = gl.createBuffer();
        gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, tBuffer2);
        gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(flatten(indices2)), gl.STATIC_DRAW);
        buffers2.push(tBuffer2);
    }

No cube is showing up yet. Replace gl.drawElements() in render() with the following, to link the buffers for the cube and see it:

    //Link data to vertex shader input
    var vPosition = gl.getAttribLocation(program, "vPosition");
    var vNormal = gl.getAttribLocation(program, "vNormal");
    var vUV = gl.getAttribLocation(program, "vUV");
    //Draw Cube
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers2[0]);
    gl.vertexAttribPointer(vPosition, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vPosition);
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers2[1]);
    gl.vertexAttribPointer(vNormal, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vNormal);
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers2[2]);
    gl.vertexAttribPointer(vUV, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vUV);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffers2[3]);
    gl.drawElements(gl.TRIANGLES, 12 * 3, gl.UNSIGNED_SHORT, 0);
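The count passed to gl.drawElements() is the number of indices: the cube has 6 faces x 2 triangles = 12 triangles, hence 12 * 3 = 36 entries in indices2, and gl.UNSIGNED_SHORT matches the Uint16Array used for the element buffer.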
Now, let's put the torus back. We first need to modify the torus code in rectangle() by changing the buffer creation part to the following (after defining the global variable "var buffers = [];"):

    //Create buffer to store the vertex coordinates
    var vBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, vBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(points), gl.STATIC_DRAW);
    buffers.push(vBuffer);
    //Create buffer to store the normals
    var nBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, nBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(normals), gl.STATIC_DRAW);
    buffers.push(nBuffer);
    //Create buffer to store the texture coordinates
    var tcBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, tcBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, flatten(UVs), gl.STATIC_DRAW);
    buffers.push(tcBuffer);
    //Create buffer to store the triangle elements
    var tBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, tBuffer);
    gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(flatten(indices)), gl.STATIC_DRAW);
    buffers.push(tBuffer);

Then we add similar code in render(), after the rendering of the cube:

    //Draw Torus
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers[0]);
    gl.vertexAttribPointer(vPosition, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vPosition);
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers[1]);
    gl.vertexAttribPointer(vNormal, 3, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vNormal);
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers[2]);
    gl.vertexAttribPointer(vUV, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(vUV);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffers[3]);
    gl.drawElements(gl.TRIANGLES, nFaces * 3, gl.UNSIGNED_SHORT, 0);

Now both objects will show up, and your code should be similar to those in "5".
Section: Rendering Options
We may want different rendering styles for our two objects. This can be done by compiling a different set of shaders, or we can provide options inside the shader. Let's follow the second approach for now. First, we can create a boolean indicating whether to use the texture in the fragment shader.

    uniform bool useTexture;
    void main() {
        vec3 lightDirection = normalize(vec3(1., 2., 0.5));
        float shading = max(0., dot(fNormal, lightDirection));
        if (useTexture)
            gl_FragColor = vec4(shading * 0.7, shading * 1., shading * 0.7, 1.)
                * texture2D(uSampler, fUV) + vec4(0.2, 0.2, 0.2, 0.);
        else
            gl_FragColor = vec4(fUV, 0., 1.);
    }

It looks funky for the torus, so let's turn the texture on when rendering the torus in render(),

    gl.uniform1i(gl.getUniformLocation(program, "useTexture"), true);
    gl.drawElements(gl.TRIANGLES, nFaces * 3, gl.UNSIGNED_SHORT, 0);

and turn it off for rendering the cube.

    gl.uniform1i(gl.getUniformLocation(program, "useTexture"), false);
    gl.drawElements(gl.TRIANGLES, 12 * 3, gl.UNSIGNED_SHORT, 0);

Finally, let's make the cube look like the standard color cube by adding a "varying vec4 fColor;" to both shaders, and setting it in the vertex shader:

    fColor = vPosition + vec4(0.5, 0.5, 0.5, 0.);

And change the else part of setting gl_FragColor (in the fragment shader) to

    else
        gl_FragColor = fColor;

Now, your code should be similar to those in "6".
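Putting the pieces together, the whole fragment shader at this stage should look roughly like the following sketch (assuming the usual precision declaration is already present in your file):

    precision mediump float;
    varying vec2 fUV;
    varying vec3 fNormal;
    varying vec4 fColor;
    uniform sampler2D uSampler;
    uniform bool useTexture;
    void main() {
        vec3 lightDirection = normalize(vec3(1., 2., 0.5));
        float shading = max(0., dot(fNormal, lightDirection));
        if (useTexture)
            gl_FragColor = vec4(shading * 0.7, shading * 1., shading * 0.7, 1.)
                * texture2D(uSampler, fUV) + vec4(0.2, 0.2, 0.2, 0.);
        else
            gl_FragColor = fColor;
    }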
Section: Simple Animation
We will try some simple physical simulation by adding gravitational acceleration to the objects. First, we add a variable to store the vertical translation in the vertex shader (before main()).

    uniform float displacement_y;

Then, we change the translation matrix to

    mat4 translation = mat4(1., 0., 0., 0.,
                            0., 1., 0., 0.,
                            0., 0., 1., 0.,
                            0., displacement_y, -3., 1.);

In the JavaScript, we add the global variables (before init())

    var displacement_y = 2.;
    var velocity_y = 0.;

In the rendering function, we add the following code before clearing the buffer

    velocity_y = 0.9999 * velocity_y - 0.1;
    displacement_y = displacement_y + velocity_y * 0.03;
    if (displacement_y < -2.) {
        displacement_y = -2.;
        velocity_y = -velocity_y;
    }
    gl.uniform1f(gl.getUniformLocation(program, "displacement_y"), displacement_y);

We assume that velocity_y is updated by the gravitational acceleration along the -y direction; it is also damped a bit (e.g., by the drag of the air). The displacement is then updated by the velocity. If the object hits the floor at y = -2, it bounces back. For simplicity, we are not dealing with the rotation and the collision properly. Now the code is similar to those in "7".
Section: Keyboard Event Handler
We will explore simple keyboard event handling. First, we create the event handler in the .js file, where we change the rotation angle around the Y-axis according to the key pressed. (37 is the left arrow, 39 the right, 38 the up, and 40 the down arrow.)

    function OnKeyDown(event) {
        if (event.keyCode == 37) {
            theta[1] -= 30.0;
        }
        if (event.keyCode == 39) {
            theta[1] += 30.0;
        }
    }

Then, we link it to the key-press events, as we did for the button events, in init().

    document.onkeydown = OnKeyDown;

Now you can try out the left and right arrow keys, and the code is similar to those in "8".
Task: UI
Add a button to pause and unpause the rotation. Hint: it is similar to the rotation buttons.
Handle the key 'p' (keycode 80) to pause and unpause the translation.
Handle the up and down arrow keys to control the rotation around the X-axis (theta[0]).
Task: Rendering
Add a specular component to the reflected color. Modify your fragment shader, using for example the code snippet from http://shdr.bkcore.com/ (http://shdr.bkcore.com/)

    vec2 blinnPhongDir(vec3 lightDir, float lightInt, float Ka, float Kd, float Ks, float shininess) {
        vec3 s = normalize(lightDir);
        vec3 v = normalize(-fPosition);
        vec3 n = normalize(fNormal);
        vec3 h = normalize(v + s);
        float diffuse = Ka + Kd * lightInt * max(0.0, dot(n, s));
        float spec = Ks * pow(max(0.0, dot(n, h)), shininess);
        return vec2(diffuse, spec);
    }

Hint: fPosition can be a variable of a similar type as fNormal (an output of the vertex shader and an input of the fragment shader), but its calculation in the vertex shader should be like gl_Position, except that no projection is needed. Your shader has a conditional branch.
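Following that hint, one way to set it up (a sketch; the varying name fPosition mirrors the snippet above) is to declare "varying vec3 fPosition;" in both shaders and compute it in the vertex shader with the same transform chain as gl_Position, minus the projection:

    fPosition = (translation * rz * ry * rx * vPosition).xyz;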
There is no need to modify the code for rendering the standard color cube.
Task: Adding an object
Add an equilateral tetrahedron or an equilateral square pyramid of a reasonable size at a reasonable location to the scene. Follow the example of how we added the cube. Use a different texture than the torus. Any texture coordinates should be fine, as long as each triangle does not have zero area in the texture image.
Hint: when you define a function that can construct the object, you'll need to call that function to actually construct the object.
Task: Submission
When you complete the assignment, download and complete the document Step 3 Grading (step3grade.doc) and include it in your project. Then proceed to the page on How to submit projects (http://www.cse.msu.edu/~cse472/resources/handinvisualstudio.htm). Note that you only need to submit the html and js files (along with the self-grade doc) in a zip file, since there are no solution/project files in this case.
CSE 472 Home Page (http://cse.msu.edu/~cse472)