You really need to download the latest copy of GameDev.
Grab a new copy of ComponentFramework24.0.4 Vulkan.zip as well.
Create a two-light system. This will require you to modify and recompile the shaders. You will also need to split
the UBO into a cameraUBO and a lightsUBO on both the CPU and GPU to carry the lighting data.
In order to do this task, you'll need to change some of my code into something way better.
Let's start this off by messing with CreateUniformBuffers(). Check out the following code bits.
Part 1:
In class I showed you how to build CreateUniformBuffers as a templated function. I'm sure you can do it.
Another function we should template is UpdateUniformBuffer(...)
Part 2:
Now that you have data in a UBO(s), the trick is to get it into the pipeline. This requires dealing with descriptors.
I have discussed them in class but now you'll have to really deal with them. Give it some thought; it might become obvious.
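Whatever you build on the C++ side has to line up with what the shaders declare. As a sketch, the shader-side view of the two buffers could look like this; the set/binding numbers and block names here are my guesses and must match your VkDescriptorSetLayoutBinding entries exactly:

```glsl
layout(set = 0, binding = 0) uniform CameraUBO {
    mat4 projection;
    mat4 view;
} camera;

layout(set = 0, binding = 1) uniform LightsUBO {
    vec4 position[2];
    vec4 diffuse[2];
} lights;
```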
I have rearranged OnCreate() in the VulkanRenderer.cpp file.
All that is left is to update the shaders to handle multiple lights and place any lighting data in the lights UBO. I told you it was easy.
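Once the lights UBO is bound, the fragment-side loop can be as small as this sketch. The names vertNormal, vertPos, and fragColour are my assumptions for the interpolated inputs and output; swap in whatever your shaders actually declare:

```glsl
vec3 colour = vec3(0.0);
for (int i = 0; i < 2; ++i) {
    vec3 lightDir = normalize(lights.position[i].xyz - vertPos);
    float diff = max(dot(normalize(vertNormal), lightDir), 0.0);
    colour += diff * lights.diffuse[i].rgb;
}
fragColour = vec4(colour, 1.0);
```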
Extra: Create ambient, diffuse[], and specular[] colours for the new light system.
In this project I want you to use a push constant to shoot Mario's model matrix over to the pipeline (shader).
Push constants are a slightly different way of passing data to the shader. In fact, they are even faster than uniform buffers. But they do have a drawback: the spec only guarantees
128 bytes of push-constant memory. Can you fit a projection, view, and model matrix in 128 bytes? Quick answer: no. So the camera will remain a UBO.
Here's a link to a good description of the push constant in case I don't make any sense.
Push Constants
Extra: It is stupid to calculate the normal matrix on the fly in the vertex shader with transpose(inverse(modelMatrix)). What? Calculate the normal matrix
for every vertex in the model? That's madness. Calculate the normal matrix (as a 4x4) on the CPU side and pass it in via the push constant, along with the model matrix.
Extra: If you know too much: the normal matrix is just a rotation, therefore really just a 3x3 matrix. However, alignment rules need to be enforced.
See std430 and std140 for details. It's just for you to show off.
Create two different objects on the screen in the GPU and draw them.
This means storing the data in separate vertex and index buffers. Look at how we did the first one.
Do you need to create multiple pipelines? No. Don't forget the descriptor sets, new ones? Let's chat.
You'll need to update the push constant for each model since it holds that model's model matrix. That just means you'll
need to write those changes into the command buffer. It will be interesting to see what you come up with.
Now, to make this look good you would need multiple textures as well. Given all I've shown you over the past several
weeks, you should be able to modify Create2DTextureImage() to handle multiple textures. Look at texture2D.sampler in Vulkan.h(.cpp).
Don't forget the descriptor sets, new ones? Let's chat. One of our graduates, Kyle, has posted a how-to at:
Multiple Textures
Kyle wanted to pass the image index through the push constant. OK, that's cool, but our push constant is full, so I showed you
how I could make some room in that structure.
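One version of that room-making trick, as a sketch (not necessarily the exact structure from class): since the normal matrix is really a 3x3 living inside a mat4, a spare component can smuggle the index across. The sampler binding, array size, and fragUV/fragColour names below are my assumptions:

```glsl
layout(push_constant) uniform PushConst {
    mat4 model;
    mat4 normal;   // only the upper 3x3 is meaningful...
} pc;

layout(set = 0, binding = 1) uniform sampler2D texSampler[2];

void main() {
    // ...so the unused normal[3].w carries the image index.
    int texIndex = int(pc.normal[3].w);
    fragColour = texture(texSampler[texIndex], fragUV);
}
```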
The next trick is to populate the normalMatrix vectors. Again, I did this in class.
In this project I hope to show you how to work the tessellation shaders: the tessellation control shader and the tessellation evaluation shader.
From the "Helpful tools" panel, download and replace the Shaders.h and Shaders.cpp files in your project with these new ones.
Draw a plane on the screen. Then tessellate it (just draw it in wireframe mode). Use the camera to get a better view. Then add the Phong
code to the eval and frag shaders. Be sure to use all three textures to make the scene look good.
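For the control-shader side, here is a minimal sketch for a quad patch. The levels are hard-coded to 8 here; a keyboard-driven level (the extra below) could arrive via a push constant instead:

```glsl
layout(vertices = 4) out;

void main() {
    // Tess levels are per-patch, so only one invocation needs to write them.
    if (gl_InvocationID == 0) {
        gl_TessLevelInner[0] = 8.0;
        gl_TessLevelInner[1] = 8.0;
        gl_TessLevelOuter[0] = 8.0;
        gl_TessLevelOuter[1] = 8.0;
        gl_TessLevelOuter[2] = 8.0;
        gl_TessLevelOuter[3] = 8.0;
    }
    // Pass the control points through untouched.
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}
```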
Extra: Can you control the level of tessellation from the keyboard?
Extra: Can you get LOD working?
The task is to visualize the normals on the surface of a 3D object. They should look like little hairs extending off the surface.
I'll demo it in class.
The geometry shader sits between the vertex shader and the fragment shader. In the OpenGL class we never had time to investigate it. Now, in Vulkan,
let's give it a look.
The website I showed you in OpenGL, learnopengl.com, is a fair starting point.
As I mentioned in class a year ago, the writing and explanations are great; the coding examples suck.
I'll help:
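Here is a minimal geometry-shader sketch in the learnopengl.com style: one line ("hair") per vertex, along its normal. For brevity it offsets in clip space; doing the offset before projection, as learnopengl shows, looks better. vertNormal and MAGNITUDE are my names, and MAGNITUDE is exactly the kind of value the extras below would feed in from the CPU:

```glsl
layout(triangles) in;
layout(line_strip, max_vertices = 6) out;

layout(location = 0) in vec3 vertNormal[];   // from the vertex shader

const float MAGNITUDE = 0.2;

void main() {
    for (int i = 0; i < 3; ++i) {
        // Line start: the vertex itself.
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
        // Line end: the vertex pushed along its normal.
        gl_Position = gl_in[i].gl_Position + vec4(vertNormal[i], 0.0) * MAGNITUDE;
        EmitVertex();
        EndPrimitive();
    }
}
```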
Extra: Can you control the color of the normals from the CPU side? (Pass the color in).
Extra: Can you control the length of the normals from the CPU side?
Now the task is to visualize the mesh as lines on the surface of an already drawn 3D object, or you could
make an object explode using the rigid body physics (Umer) and pass in time.
Ideas can be found on the website I showed you in OpenGL; learnopengl.com is a fair starting point.
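For the explode route, a geometry-shader sketch in the learnopengl.com style: push each triangle along its face normal by a time-driven amount. The push-constant block and elapsedTime name are my assumptions; the time could just as well live in a UBO:

```glsl
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

layout(push_constant) uniform Push { float elapsedTime; } pc;

void main() {
    // Face normal from two edges of the incoming triangle.
    vec3 a = gl_in[0].gl_Position.xyz;
    vec3 b = gl_in[1].gl_Position.xyz;
    vec3 c = gl_in[2].gl_Position.xyz;
    vec3 faceNormal = normalize(cross(b - a, c - a));

    vec4 offset = vec4(faceNormal * pc.elapsedTime * 0.5, 0.0);
    for (int i = 0; i < 3; ++i) {
        gl_Position = gl_in[i].gl_Position + offset;
        EmitVertex();
    }
    EndPrimitive();
}
```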
Extra: Can you offset the mesh line away from the surface?
Extra: Can you use the keyboard to control these effects?