This tutorial is part of a Collection: 03. DirectX 11 - Braynzar Soft Tutorials
04. Begin Drawing!

Now that we have initialized Direct3D, we can start displaying our crazy minds on the computer! Well... maybe not quite, but drawing a simple triangle is a huge step in the right direction, as ALL 3D objects and scenes we will be drawing are made up of them. Here we will draw a simple, solid blue triangle. We will discuss the rendering pipeline and get an idea of how shaders work.

Here we will learn to draw a simple blue triangle! Before we actually get started, we need to cover how Direct3D 11 actually works. I try to make lessons as short as possible while still getting something done, by taking the shortest meaningful steps possible. I think they're more efficient that way, as you won't get as bored as you would reading through one that takes hours and hours, and you'll get more out of shorter steps by not falling asleep. You can probably see this is a bit of a longer one, like the previous lesson on initializing Direct3D 11, but it's a big step necessary for the rest. Truthfully though, it's not as complicated as it may look once you read through it; there really isn't a whole lot of new code.

##Programmable Graphics Rendering Pipeline##

If you've had experience with Direct3D 10 and understand its pipeline, you can pretty much skip these pipeline sections, since Direct3D 11 is basically an extension of Direct3D 10: it uses the same pipeline but with a couple of extra stages. Direct3D 11 adds three new stages to the programmable graphics pipeline. Not only that, it also supports another separate, but loosely connected, pipeline called the compute shader pipeline.

The three new stages in the graphics rendering pipeline (a.k.a. draw pipeline) are the Hull, Tessellator, and Domain shader stages. They have to do with tessellation, which basically adds more detail to an object, a LOT more detail. For example, it can take a simple triangle from a model, add a few more vertices to create a couple more triangles, then reposition those vertices to make the surface much more detailed. It can take a simple low-polygon model and turn it into a very detailed, high-poly model by the time it's displayed on the screen, and it does all of this very quickly and efficiently. This is an advanced topic we will not be learning how to implement in this lesson.

The compute shader (a.k.a. dispatch pipeline) is used to do extremely fast computations by extending the processing power of the CPU with the GPU, using the GPU as a sort of parallel processor. This does not have to have anything to do with graphics; for example, you could run very expensive operations, such as accurate collision detection, on the GPU using the compute shader pipeline. The compute shader will not be discussed in this lesson.

The rendering pipeline is the set of steps Direct3D uses to create a 2D image based on what the virtual camera sees. It consists of the 7 stages used in Direct3D 10, along with the 3 new stages introduced with Direct3D 11, which are as follows:

**1. Input Assembler (IA) Stage**
**2. Vertex Shader (VS) Stage**
**3. Hull Shader (HS) Stage**
**4. Tessellator (TS) Stage**
**5. Domain Shader (DS) Stage**
**6. Geometry Shader (GS) Stage**
**7. Stream Output (SO) Stage**
**8. Rasterizer (RS) Stage**
**9. Pixel Shader (PS) Stage**
**10. Output Merger (OM) Stage**

Another change is that we must now compile each shader ourselves, which makes sure the shader is error free. Also, instead of setting a technique from an effect file as the active technique (sequence of shaders), we set the individual shaders whenever we want in code. I believe this is a more dynamic approach, as it gives us more freedom to change one active shader while keeping the others active. For example, we can change the pixel shader from one that uses lighting calculations to determine the final pixel color to one that does not, while still keeping the same vertex shader active.
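To make that last point concrete, here is a quick sketch of swapping only the pixel shader between draw calls while the same vertex shader stays bound. This isn't code we use in this lesson, just an illustration: PS_Lit, PS_Unlit, litVertexCount, and unlitVertexCount are made-up names, and the two pixel shaders are assumed to have been created earlier with ID3D11Device::CreatePixelShader().

d3d11DevCon->VSSetShader(VS, 0, 0);        //the vertex shader stays bound the whole time

d3d11DevCon->PSSetShader(PS_Lit, 0, 0);    //pixel shader that does lighting calculations
d3d11DevCon->Draw(litVertexCount, 0);

d3d11DevCon->PSSetShader(PS_Unlit, 0, 0);  //swap in a pixel shader without lighting
d3d11DevCon->Draw(unlitVertexCount, 0);

Because Direct3D is a state machine, each shader stays active until it is explicitly replaced.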
+[http://www.braynzarsoft.net/image/100018][Direct3D Pipeline]

The rounded stages are "programmable" stages, which we actually create ourselves. The square stages are the ones we do not program, but we can change their settings using the Direct3D 11 device context.

##Input Assembler (IA) Stage##

The first stage you can see is the Input Assembler (IA). The IA is a fixed function stage, which means we do not do the programming to implement it. The IA reads geometric data, vertices and indices, then uses that data to create geometric primitives like triangles, lines, and points, which will be fed into and used by the other stages. Indices define how the primitives should be put together from the vertices. We will discuss indices in a later lesson.

Before we send anything to the IA, we need to do a couple of things first, such as create a buffer and set the primitive topology, input layout, and active buffers.

To start, we first create a buffer. The two buffers used by the IA are the vertex and index buffers. In this lesson we will not worry about the index buffer yet. To create the buffer, we will fill out a D3D11_BUFFER_DESC structure.

After creating the buffer or buffers, we need to create the input-layout object. This tells Direct3D what our vertex structure consists of and what to do with each component in our vertex structure. We provide this information to Direct3D with an array of D3D11_INPUT_ELEMENT_DESC elements. Each element in the D3D11_INPUT_ELEMENT_DESC array describes one element in the vertex structure. If your vertex structure has a position element and a color element, then your D3D11_INPUT_ELEMENT_DESC array will have one element for the position and one for the color. Here is an example (using the older D3DX math types):

//The vertex Structure
struct Vertex
{
    D3DXVECTOR3 pos;
    D3DXCOLOR color;
};

//The input-layout description
D3D11_INPUT_ELEMENT_DESC layout[] =
{
    {"POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0}
};
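A quick side note, not something this lesson needs: rather than hard-coding the 12-byte offset of the color element, you can let the compiler compute the AlignedByteOffset values with offsetof() (from <cstddef>). Just a sketch, using a made-up position-plus-color vertex built from the XNA math types:

struct VertexPosColor    //hypothetical vertex with a position and a color
{
    XMFLOAT3 pos;
    XMFLOAT4 color;
};

D3D11_INPUT_ELEMENT_DESC posColorLayout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0, (UINT)offsetof(VertexPosColor, pos),   D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, (UINT)offsetof(VertexPosColor, color), D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
//offsetof(VertexPosColor, color) evaluates to 12 here, the size of the XMFLOAT3 position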
In this lesson, our vertex structure looks like this:

struct Vertex    //Overloaded Vertex Structure
{
    Vertex(){}
    Vertex(float x, float y, float z)
        : pos(x,y,z){}

    XMFLOAT3 pos;
};

So our input layout description looks like this:

D3D11_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};

After describing the input layout with the D3D11_INPUT_ELEMENT_DESC structure, we need to create it with the function ID3D11Device::CreateInputLayout().

We also need to create a vertex buffer to hold our object's vertices. To create a vertex buffer, first we describe our buffer using a D3D11_BUFFER_DESC structure, then fill a D3D11_SUBRESOURCE_DATA structure with the actual vertex data. To do the actual creation of the vertex buffer, we call ID3D11Device::CreateBuffer().

The next step is binding our layout description and vertex buffer to the IA. We can bind our layout and vertex buffer to the IA by calling the functions ID3D11DeviceContext::IASetVertexBuffers() and ID3D11DeviceContext::IASetInputLayout().

Now we need to set the primitive topology so that the IA will know how to use the vertices to make primitives such as triangles or lines. We call the function ID3D11DeviceContext::IASetPrimitiveTopology(). I will cover the different topology types later in this lesson.

After our pipeline is ready, we call a draw method to send the primitives to the IA. The method we call in this lesson is ID3D11DeviceContext::Draw().

##Vertex Shader (VS) Stage##

The VS is the first programmable shader stage, which means we have to program it ourselves. The VS stage is what ALL the vertices go through after the primitives have been assembled in the IA; every vertex drawn will be put through the VS. With the VS, you are able to do things like transformation, scaling, lighting, and displacement mapping for textures. The vertex shader must always be implemented for the pipeline to work, even if the vertices in the program do not need to be modified.

The shaders in the pipeline are written in the language HLSL. Its syntax is similar to C++, so it's not hard to learn. I will explain the effects file in each lesson we change it, and later we will have a lesson dedicated to HLSL. For this lesson, our vertex shader does nothing special, so we just return the position of each vertex without modifying it. It looks like this:

float4 VS(float4 inPos : POSITION) : SV_POSITION
{
    return inPos;
}

The vertex shader takes a single vertex as input and returns a single vertex as output. Notice the POSITION right after inPos in the VS parameters. When we create our vertex (input) layout, we specify POSITION for the position values of our vertex, so they will be sent to this parameter in the VS. You can change the name from POSITION if you want, as long as the layout and the shader agree.

##Hull Shader (HS) Stage##

The HS is the first of the three new optional stages added to the Direct3D 11 graphics rendering pipeline. The three new stages, the Hull Shader, Tessellator, and Domain Shader stages, work together to implement something called tessellation. Tessellation takes a primitive, such as a triangle or line, and divides it into many smaller sections to increase the detail of a model, and does so extremely fast. It creates all these new primitives on the GPU right before they are put onto the screen, and they are not saved to memory, which saves a lot of the time it would take to create them on the CPU and store them. You can take a simple low-poly model and turn it into a very detailed, high-poly model using tessellation.

So, back to the Hull Shader. This is another programmable stage. I'm not going to go into detail, but what this stage does is calculate how and where to add new vertices to a primitive to make it more detailed. It then sends this data to the Tessellator stage and the Domain Shader stage.

##Tessellator (TS) Stage##

The Tessellator stage is the second stage in the tessellation process. This is a fixed function stage. What this stage does is take the input from the Hull Shader and actually do the dividing of the primitive. It then passes the data on to the Domain Shader.

##Domain Shader (DS) Stage##

This is the third of the three stages in the tessellation process. This is a programmable stage. What this stage does is take the positions of the new vertices from the Hull Shader stage and transform the vertices received from the Tessellator stage to create the added detail, since just adding more vertices in the middle of a flat triangle or line would not increase the detail in any way. It then passes the vertices on to the Geometry Shader stage.

##Geometry Shader (GS) Stage##

This shader stage is optional. It is another programmable stage. It accepts whole primitives as input, such as 3 vertices for a triangle, 2 for a line, and 1 for a point. It can also take data from edge-adjacent primitives as input, such as an additional 2 vertices for a line or an additional 3 for a triangle.
An advantage of the GS is that it can create or destroy primitives, where the VS cannot (the VS takes in one vertex and outputs one vertex). We could turn one point into a quad or a triangle with this stage. We are able to pass data from the GS to the rasterizer stage, and/or through the Stream Output to a vertex buffer in memory. We'll learn more about this shader stage in a later lesson.

##Stream Output (SO) Stage##

This stage is used to obtain vertex data from the pipeline, specifically from the Geometry Shader stage, or from the Vertex Shader stage if there is no GS. Vertex data sent to memory from the SO is put into one or more vertex buffers. Vertex data output from the SO is always sent out as lists, such as line lists or triangle lists. Incomplete primitives, such as a triangle with only 2 vertices or a line with only one vertex, are NEVER sent out; they are silently discarded, just as they are in the vertex and geometry shader stages.

##Rasterizer (RS) Stage##

The RS stage takes the vector information (shapes and primitives) sent to it and turns it into pixels by interpolating per-vertex values across each primitive. It also handles clipping, which is basically cutting away the parts of primitives that are outside the view of the screen. We set the viewport in the RS using ID3D11DeviceContext::RSSetViewports().

##Pixel Shader (PS) Stage##

This stage does calculations and modifies each pixel that will be seen on the screen, such as lighting on a per-pixel basis. It is another programmable stage, and an optional one. The RS invokes the pixel shader once for each pixel fragment in a primitive. Like we said before, the values and attributes of each vertex in a primitive are interpolated across the entire primitive in the RS. The pixel shader is like the vertex shader in that it has a 1:1 mapping: the vertex shader takes in one vertex and returns one vertex, and the pixel shader takes in one pixel fragment and returns one pixel fragment.

The job of the pixel shader is to calculate the final color of each pixel fragment. A pixel fragment is each potential pixel that could be drawn to the screen. For example, suppose there is a solid square behind a solid circle. The pixels in the square are pixel fragments and the pixels in the circle are pixel fragments. Each has a chance to be written to the screen, but once they get to the Output Merger stage, which decides the final pixels drawn to the screen, the OM will see that the depth value of the circle is less than the depth value of the square, so only the pixels from the circle will be drawn where the two overlap.

The PS outputs a 4D color value. In this lesson, we use the pixel shader to make the triangle blue by returning a 4D float value equal to the color blue. There are no calculations done, just simply returning the color blue, so every pixel that is run through the shader will be blue.

float4 PS() : SV_TARGET
{
    return float4(0.0f, 0.0f, 1.0f, 1.0f);
}

##Output Merger (OM) Stage##

The final stage in the pipeline is the Output Merger stage. Basically, this stage takes the pixel fragments and the depth/stencil buffers and determines which pixels are actually written to the render target. We set the render target in the last lesson by calling ID3D11DeviceContext::OMSetRenderTargets(). We will talk more about this stage later.

After the OM stage, all that's left to do is present our backbuffer to the screen. We can do this by calling IDXGISwapChain::Present().
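A quick aside about Present(), which this lesson doesn't rely on: the first argument is the sync interval. Passing 0 presents the backbuffer as soon as possible, while passing 1 waits for the next vertical blank, which caps the frame rate to the monitor's refresh rate.

SwapChain->Present(0, 0);    //present immediately, no vsync (what this lesson does)
//SwapChain->Present(1, 0);  //sync presentation to the monitor's vertical refresh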
Here we create a couple of new interface objects. Remember, interface objects must be released when you're done with them. The first one is the buffer which will hold our vertex data. The next two are our vertex and pixel shaders. After that we have our vertex and pixel shader buffers (blobs), which hold the compiled bytecode and information about our vertex and pixel shaders. The last one is our input (vertex) layout.

ID3D11Buffer* triangleVertBuffer;
ID3D11VertexShader* VS;
ID3D11PixelShader* PS;
ID3D10Blob* VS_Buffer;
ID3D10Blob* PS_Buffer;
ID3D11InputLayout* vertLayout;

##Vertex Structure & Input Layout## ( D3D11_INPUT_ELEMENT_DESC )

All the 3D objects we make will be made of points with attributes, such as color, called vertices. We have to make our own vertex structure. This is an overloaded vertex structure (so we can create and edit a vertex easily and dynamically) with only a position. Notice the XMFLOAT3. As I mentioned in the last lesson, Direct3D is moving away from the D3DX math library and toward the more popular XNA math library; before, we would have used D3DXVECTOR3.

After that you can see we have our input layout. An input layout is defined using an array of D3D11_INPUT_ELEMENT_DESC structures. The D3D11_INPUT_ELEMENT_DESC looks like this:

typedef struct D3D11_INPUT_ELEMENT_DESC {
    LPCSTR SemanticName;
    UINT SemanticIndex;
    DXGI_FORMAT Format;
    UINT InputSlot;
    UINT AlignedByteOffset;
    D3D11_INPUT_CLASSIFICATION InputSlotClass;
    UINT InstanceDataStepRate;
} D3D11_INPUT_ELEMENT_DESC;

Where each member is described below:

SemanticName - This is just a string to associate with the element. This string will be used to map the elements in the vertex structure to the elements in the vertex shader.

SemanticIndex - This is basically just a number after the semantic name to use as an index. For example, if we have 2 texture elements in the vertex structure, instead of creating 2 different texture semantic names, we can just use 2 different indices. If a semantic name in the vertex shader code has no index after it, it defaults to index 0. For example, in our shader code our semantic name is "POSITION", which is actually the same as "POSITION0".

Format - This is just the format of the component in our vertex structure. It needs to be a member of the DXGI_FORMAT enumerated type. In this lesson, we have a 3D vector describing the position, so we can use DXGI_FORMAT_R32G32B32_FLOAT. If you need other formats, you can find them on MSDN; later we will be using other ones.

InputSlot - Direct3D allows us to use 16 different input slots (0-15) through which you can feed vertex data. If our vertex structure has a position and a color, we could put both elements through the same input slot, or we could put the position data through the first slot and the color data through the second slot (see the sketch after this list). We only need to use one, but you can experiment if you would like.

AlignedByteOffset - This is the byte offset of the element you are describing. In a single input slot, if we have position and color, position could be 0 since it starts at the beginning of the vertex structure, and color would need to be the size of our vertex position, which is 12 bytes (remember our format for the vertex position is DXGI_FORMAT_R32G32B32_FLOAT, which is 96 bits, 32 for each component in the position; there are 8 bits in one byte, so 96/8 == 12).

InputSlotClass - Right now we can just use D3D11_INPUT_PER_VERTEX_DATA. The other option is used for instancing, which is an advanced technique we will get to later.

InstanceDataStepRate - This is also used only for instancing, so we will specify 0 for now.
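Here is the two-slot idea from the InputSlot description as a sketch. It is only an illustration, nothing in this lesson uses it, and posBuffer and colorBuffer are made-up buffers assumed to have been created separately. The InputSlot field of each element says which bound vertex buffer it reads from, so each element's AlignedByteOffset starts back at 0:

D3D11_INPUT_ELEMENT_DESC twoSlotLayout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },  //reads from slot 0
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 1, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },  //reads from slot 1
};

ID3D11Buffer* buffers[2] = { posBuffer, colorBuffer };
UINT strides[2] = { sizeof(XMFLOAT3), sizeof(XMFLOAT4) };
UINT offsets[2] = { 0, 0 };
d3d11DevCon->IASetVertexBuffers(0, 2, buffers, strides, offsets);  //fill slots 0 and 1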
After we define the input layout, we create a global variable to hold the number of elements in our input layout array. We do this so that later we will not have to remember to keep updating the function that creates the input layout.

struct Vertex    //Overloaded Vertex Structure
{
    Vertex(){}
    Vertex(float x, float y, float z)
        : pos(x,y,z){}

    XMFLOAT3 pos;
};

D3D11_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
UINT numElements = ARRAYSIZE(layout);

##Cleaning Up##

The next new things are in the CleanUp function, where we release the interfaces when we are done with our program.

void CleanUp()
{
    //Release the COM Objects we created
    SwapChain->Release();
    d3d11Device->Release();
    d3d11DevCon->Release();
    renderTargetView->Release();
    triangleVertBuffer->Release();
    VS->Release();
    PS->Release();
    VS_Buffer->Release();
    PS_Buffer->Release();
    vertLayout->Release();
}

##Initializing the Scene##

Here is where we initialize our scene. This is where we put the things that will change throughout the course of our game, but not throughout a single scene. Pretty much all the new stuff in this lesson is here; I will explain one part at a time.

bool InitScene()
{
    //Compile Shaders from shader file
    hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "VS", "vs_5_0", 0, 0, 0, &VS_Buffer, 0, 0);
    hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "PS", "ps_5_0", 0, 0, 0, &PS_Buffer, 0, 0);

    //Create the Shader Objects
    hr = d3d11Device->CreateVertexShader(VS_Buffer->GetBufferPointer(), VS_Buffer->GetBufferSize(), NULL, &VS);
    hr = d3d11Device->CreatePixelShader(PS_Buffer->GetBufferPointer(), PS_Buffer->GetBufferSize(), NULL, &PS);

    //Set Vertex and Pixel Shaders
    d3d11DevCon->VSSetShader(VS, 0, 0);
    d3d11DevCon->PSSetShader(PS, 0, 0);

    //Create the vertex buffer
    Vertex v[] =
    {
        Vertex( 0.0f, 0.5f, 0.5f ),
        Vertex( 0.5f, -0.5f, 0.5f ),
        Vertex( -0.5f, -0.5f, 0.5f ),
    };

    D3D11_BUFFER_DESC vertexBufferDesc;
    ZeroMemory( &vertexBufferDesc, sizeof(vertexBufferDesc) );

    vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
    vertexBufferDesc.ByteWidth = sizeof( Vertex ) * 3;
    vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    vertexBufferDesc.CPUAccessFlags = 0;
    vertexBufferDesc.MiscFlags = 0;

    D3D11_SUBRESOURCE_DATA vertexBufferData;
    ZeroMemory( &vertexBufferData, sizeof(vertexBufferData) );
    vertexBufferData.pSysMem = v;

    hr = d3d11Device->CreateBuffer( &vertexBufferDesc, &vertexBufferData, &triangleVertBuffer);

    //Set the vertex buffer
    UINT stride = sizeof( Vertex );
    UINT offset = 0;
    d3d11DevCon->IASetVertexBuffers( 0, 1, &triangleVertBuffer, &stride, &offset );

    //Create the Input Layout
    hr = d3d11Device->CreateInputLayout( layout, numElements, VS_Buffer->GetBufferPointer(),
        VS_Buffer->GetBufferSize(), &vertLayout );

    //Set the Input Layout
    d3d11DevCon->IASetInputLayout( vertLayout );

    //Set Primitive Topology
    d3d11DevCon->IASetPrimitiveTopology( D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST );

    //Create the Viewport
    D3D11_VIEWPORT viewport;
    ZeroMemory(&viewport, sizeof(D3D11_VIEWPORT));

    viewport.TopLeftX = 0;
    viewport.TopLeftY = 0;
    viewport.Width = Width;
    viewport.Height = Height;

    //Set the Viewport
    d3d11DevCon->RSSetViewports(1, &viewport);

    return true;
}
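One small note before we break this function down: the listing above compiles with the "vs_5_0" and "ps_5_0" profiles, while the complete code at the bottom of this lesson uses "vs_4_0" and "ps_4_0" (as noted in the pProfile description below, some hardware only supports the 4_0 profiles). The profile is just a string parameter, so a sketch of keeping it in one place could look like this:

//Just a sketch, not part of the lesson's code: pick the shader profile once.
//Use "vs_4_0"/"ps_4_0" if your hardware only supports a Direct3D 10 feature level.
LPCSTR vsProfile = "vs_5_0";
LPCSTR psProfile = "ps_5_0";

hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "VS", vsProfile, 0, 0, 0, &VS_Buffer, 0, 0);
hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "PS", psProfile, 0, 0, 0, &PS_Buffer, 0, 0);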
##Compiling the Shaders## ( D3DX11CompileFromFile() )

We will start initializing our scene by creating the shaders. We will compile the shaders from an effect file called "Effects.fx". We can do this by using the function D3DX11CompileFromFile():

HRESULT WINAPI D3DX11CompileFromFile(
    LPCSTR pSrcFile,
    CONST D3D10_SHADER_MACRO* pDefines,
    LPD3D10INCLUDE pInclude,
    LPCSTR pFunctionName,
    LPCSTR pProfile,
    UINT Flags1,
    UINT Flags2,
    ID3DX11ThreadPump* pPump,
    ID3D10Blob** ppShader,
    ID3D10Blob** ppErrorMsgs,
    HRESULT* pHResult);

Where each parameter is described below:

pSrcFile - A string containing the name of the file that the shader is in.

pDefines - A pointer to an array of macros. We can set this to NULL.

pInclude - This is a pointer to an include interface. If our shader used #include in the file, we could not put NULL here, but our shader does not use #include, so we set this to NULL.

pFunctionName - This is the name of the shader function in that file.

pProfile - The version of the shader you want to use. Direct3D 11 supports shader version 5.0. However, on my laptop I need to set this to "vs_4_0" and "ps_4_0".

Flags1 - Compile flags. We will set this to NULL.

Flags2 - Effect flags. We will also set this to NULL.

pPump - This has to do with multi-threading. We set NULL so the function will not return until it has completed.

ppShader - This is the returned shader. It is not the actual shader, but a buffer containing the compiled shader bytecode and information about the shader. We will then use this buffer to create the actual shader.

ppErrorMsgs - This returns the list of errors and warnings that occurred while compiling the shader. These errors and warnings are the same ones you can see at the bottom of the debugger.

pHResult - This is the returned HRESULT. We wrote "hr =" in front of this function, but you could instead pass "&hr" for this parameter to do the same thing.

hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "VS", "vs_5_0", 0, 0, 0, &VS_Buffer, 0, 0);
hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "PS", "ps_5_0", 0, 0, 0, &PS_Buffer, 0, 0);

##Creating the Shaders## ( ID3D11Device::CreateVertexShader() )

We use our global HRESULT object, hr, for error checking. I have not included error checking in order to keep the code more clear and condensed, but I will explain at the end of this lesson how you can implement error checking.

HRESULT CreateVertexShader(
    [in] const void *pShaderBytecode,
    [in] SIZE_T BytecodeLength,
    [in] ID3D11ClassLinkage *pClassLinkage,
    [out] ID3D11VertexShader **ppVertexShader);

HRESULT CreatePixelShader(
    [in] const void *pShaderBytecode,
    [in] SIZE_T BytecodeLength,
    [in] ID3D11ClassLinkage *pClassLinkage,
    [out] ID3D11PixelShader **ppPixelShader);

Where each parameter is described below:

pShaderBytecode - This is a pointer to the start of the compiled shader buffer.

BytecodeLength - This is the size of that buffer.

pClassLinkage - A pointer to a class linkage interface. We will set this to NULL.

ppVertexShader - This is our returned vertex shader.

ppPixelShader - This is our returned pixel shader.

hr = d3d11Device->CreateVertexShader(VS_Buffer->GetBufferPointer(), VS_Buffer->GetBufferSize(), NULL, &VS);
hr = d3d11Device->CreatePixelShader(PS_Buffer->GetBufferPointer(), PS_Buffer->GetBufferSize(), NULL, &PS);
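The ppErrorMsgs parameter described above is the easiest way to find out why a compile call failed. Here is a small sketch of using it; this is not part of the lesson's code, and errorBlob is just a made-up local:

ID3D10Blob* errorBlob = 0;
hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "VS", "vs_5_0", 0, 0, 0, &VS_Buffer, &errorBlob, 0);
if (FAILED(hr))
{
    if (errorBlob)
    {
        //The blob text is the same compiler output you would see in the debugger
        OutputDebugStringA((char*)errorBlob->GetBufferPointer());
        errorBlob->Release();
    }
    //Inside InitScene() you could return false here to stop initialization
}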
##Setting the Shaders## ( ID3D11DeviceContext::VSSetShader() )

Now that we've compiled and created our shaders, we need to set them as the current pipeline shaders. We can do this by calling ID3D11DeviceContext::VSSetShader() to set the vertex shader and ID3D11DeviceContext::PSSetShader() to set the pixel shader (there are other shaders we will learn to set later, such as the geometry shader).

Most of the time, an application will use different sets of shaders for different sets of geometry; for example, later we will be using a separate pixel shader when we draw our skybox. Because of this, you will be setting the shaders during runtime instead of only in the scene setup function. Remember that Direct3D is a "state machine": it will keep the current state and settings until they are changed later, so don't expect Direct3D to set the shaders back to some default after you have set them in code. You need to always set the right shaders before you render your stuff, and that goes for render states and other things too. We'll talk about render states in a later chapter.

void VSSetShader(
    [in] ID3D11VertexShader *pVertexShader,
    [in] ID3D11ClassInstance *const *ppClassInstances,
    [in] UINT NumClassInstances);

void PSSetShader(
    [in] ID3D11PixelShader *pPixelShader,
    [in] ID3D11ClassInstance *const *ppClassInstances,
    [in] UINT NumClassInstances);

Where each parameter is described below:

pVertexShader - This is our vertex shader.

pPixelShader - This is our pixel shader.

ppClassInstances - This is only used if our shader uses an interface. Set it to NULL.

NumClassInstances - This is the number of class instances in the ppClassInstances array. We set it to 0 because there aren't any.

d3d11DevCon->VSSetShader(VS, 0, 0);
d3d11DevCon->PSSetShader(PS, 0, 0);

##Creating the Vertex Buffer## ( ID3D11Buffer )

Now we need to create our vertex buffer. We start by making an array of vertices using our Vertex structure. After we have an array of vertices, we describe our vertex buffer by filling out a D3D11_BUFFER_DESC structure, making sure it is empty first by calling ZeroMemory(). The D3D11_BUFFER_DESC looks like this:

typedef struct D3D11_BUFFER_DESC {
    UINT ByteWidth;
    D3D11_USAGE Usage;
    UINT BindFlags;
    UINT CPUAccessFlags;
    UINT MiscFlags;
    UINT StructureByteStride;
} D3D11_BUFFER_DESC;

Where each member is described below:

ByteWidth - This is the size in bytes of our buffer.

Usage - A D3D11_USAGE value describing how our buffer will be read from and written to.

BindFlags - We specify D3D11_BIND_VERTEX_BUFFER since this is a vertex buffer.

CPUAccessFlags - This says how our buffer will be accessed by the CPU. We can set this to 0.

MiscFlags - Extra flags we will not use; set this to 0 too.

StructureByteStride - Not used here; set this to 0.

Now that we have a description of our buffer, we need to fill out a D3D11_SUBRESOURCE_DATA structure with the data we want in the buffer. The structure looks like this:

typedef struct D3D11_SUBRESOURCE_DATA {
    const void *pSysMem;
    UINT SysMemPitch;
    UINT SysMemSlicePitch;
} D3D11_SUBRESOURCE_DATA;

Where each member is described below:

pSysMem - This is the data we want to put into the buffer.

SysMemPitch - This is the distance in bytes from one line to the next line of a texture. It is only used for 2D and 3D textures.

SysMemSlicePitch - The distance in bytes from one depth level to the next in a 3D texture. Only used for 3D textures.
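As a small aside (a variant we don't actually use in this lesson): since the triangle's vertices never change after creation, the description could instead use D3D11_USAGE_IMMUTABLE, which tells the driver the buffer is read-only for the GPU, and ByteWidth can be computed from the array so it stays correct if vertices are added later. Just a sketch:

D3D11_BUFFER_DESC staticVertexBufferDesc;
ZeroMemory( &staticVertexBufferDesc, sizeof(staticVertexBufferDesc) );

staticVertexBufferDesc.Usage = D3D11_USAGE_IMMUTABLE;               //GPU read-only; initial data is required at creation
staticVertexBufferDesc.ByteWidth = sizeof( Vertex ) * ARRAYSIZE(v); //computed from the vertex array instead of hard-coding 3
staticVertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;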
Now we can finally create the buffer using the buffer description and buffer data we just filled out. To create the buffer, all we have to do is call ID3D11Device::CreateBuffer(). The function looks like this:

HRESULT CreateBuffer(
    [in] const D3D11_BUFFER_DESC *pDesc,
    [in] const D3D11_SUBRESOURCE_DATA *pInitialData,
    [out] ID3D11Buffer **ppBuffer );

Where each parameter is described below:

pDesc - Pointer to our buffer description.

pInitialData - Pointer to a subresource data structure containing the data we want to put in the buffer. We can set this to NULL if we want to add the data later.

ppBuffer - The returned ID3D11Buffer.

Vertex v[] =
{
    Vertex( 0.0f, 0.5f, 0.5f ),
    Vertex( 0.5f, -0.5f, 0.5f ),
    Vertex( -0.5f, -0.5f, 0.5f ),
};

D3D11_BUFFER_DESC vertexBufferDesc;
ZeroMemory( &vertexBufferDesc, sizeof(vertexBufferDesc) );

vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
vertexBufferDesc.ByteWidth = sizeof( Vertex ) * 3;
vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
vertexBufferDesc.CPUAccessFlags = 0;
vertexBufferDesc.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA vertexBufferData;
ZeroMemory( &vertexBufferData, sizeof(vertexBufferData) );
vertexBufferData.pSysMem = v;

hr = d3d11Device->CreateBuffer( &vertexBufferDesc, &vertexBufferData, &triangleVertBuffer);

##Setting the Vertex Buffer## ( ID3D11DeviceContext::IASetVertexBuffers() )

Now that we have a vertex buffer, we need to bind it to the IA. We can do this by calling the ID3D11DeviceContext::IASetVertexBuffers() function:

void IASetVertexBuffers(
    [in] UINT StartSlot,
    [in] UINT NumBuffers,
    [in] ID3D11Buffer *const *ppVertexBuffers,
    [in] const UINT *pStrides,
    [in] const UINT *pOffsets );

Where each parameter is described below:

StartSlot - This is the input slot to start binding at. We set 0 here.

NumBuffers - This is the number of buffers we are binding. We are only binding 1.

ppVertexBuffers - This is a pointer to our actual vertex buffer.

pStrides - This is the size of each vertex.

pOffsets - This is an offset in bytes from the beginning of the buffer to where the vertex data starts.

UINT stride = sizeof( Vertex );
UINT offset = 0;
d3d11DevCon->IASetVertexBuffers( 0, 1, &triangleVertBuffer, &stride, &offset );

##Creating the Input (Vertex) Layout## ( ID3D11Device::CreateInputLayout() )

Next, we need to create our input layout. We can do this with the function ID3D11Device::CreateInputLayout():

HRESULT CreateInputLayout(
    [in] const D3D11_INPUT_ELEMENT_DESC *pInputElementDescs,
    [in] UINT NumElements,
    [in] const void *pShaderBytecodeWithInputSignature,
    [in] SIZE_T BytecodeLength,
    [out] ID3D11InputLayout **ppInputLayout );

Where each parameter is described below:

pInputElementDescs - This is the array of D3D11_INPUT_ELEMENT_DESC elements that contains our vertex layout.

NumElements - This is the number of elements in our vertex layout.

pShaderBytecodeWithInputSignature - This is a pointer to the start of our compiled vertex shader.

BytecodeLength - This is the size of our compiled vertex shader.

ppInputLayout - This is the returned pointer to our input (vertex) layout.

hr = d3d11Device->CreateInputLayout( layout, numElements, VS_Buffer->GetBufferPointer(),
    VS_Buffer->GetBufferSize(), &vertLayout );

##Setting the Input (Vertex) Layout## ( ID3D11DeviceContext::IASetInputLayout() )

We have created our vertex layout; the next thing to do is bind it to the IA as the active input (vertex) layout. We can do this by calling the function ID3D11DeviceContext::IASetInputLayout():

void STDMETHODCALLTYPE IASetInputLayout(
    [in] ID3D11InputLayout *pInputLayout );

Where the only parameter here is:

pInputLayout - Our created input layout.

d3d11DevCon->IASetInputLayout( vertLayout );
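One thing worth knowing about the creation step above: CreateInputLayout() checks the element array against the vertex shader's input signature, which is why the compiled shader blob is passed in. A mismatch between the layout and the shader's parameters shows up as a failed HRESULT, so a minimal check (just a sketch, not code this lesson uses) looks like this:

hr = d3d11Device->CreateInputLayout( layout, numElements, VS_Buffer->GetBufferPointer(),
    VS_Buffer->GetBufferSize(), &vertLayout );
if (FAILED(hr))
{
    //Most often this means the layout array and the VS input parameters don't match
    MessageBox(0, L"CreateInputLayout Failed", L"Error", MB_OK | MB_ICONERROR);
    //Inside InitScene() you would return false here
}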
##Setting the Primitive Topology## ( ID3D11DeviceContext::IASetPrimitiveTopology() )

This is where we tell the IA what type of primitives we are sending it. We can set the primitive topology by calling the function ID3D11DeviceContext::IASetPrimitiveTopology(). The only parameter is a D3D11_PRIMITIVE_TOPOLOGY enumerated type. The following is a list of the common types:

Point List - D3D11_PRIMITIVE_TOPOLOGY_POINTLIST. With this topology, every vertex is drawn as an individual point.

Line Strip - D3D11_PRIMITIVE_TOPOLOGY_LINESTRIP. This is basically "connect the dots": all vertices are part of one continuous line.

Line List - D3D11_PRIMITIVE_TOPOLOGY_LINELIST. Every two vertices create a line. The difference between this and a line strip is that in a line strip all vertices are connected to form one continuous line.

Triangle Strip - D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP. Here we create triangles where each triangle shares vertices with the adjacent triangles, so all triangles are connected.

Triangle List - D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST. This says that every 3 vertices make a triangle, so the triangles do not have to be connected. It is slower than a triangle strip because more vertices must be used: with a triangle strip you can make 2 triangles from 4 vertices, while in a triangle list you need 6 vertices to create 2 triangles.

Primitives with Adjacency - An example is D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST_ADJ. These are only used with the geometry shader; we will not worry about them for now.

d3d11DevCon->IASetPrimitiveTopology( D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST );

##Creating the Viewport## ( D3D11_VIEWPORT )

Now all that's left to do is create and set our viewport. The viewport tells the RS stage of the pipeline where to draw. We create a viewport using the D3D11_VIEWPORT structure. The viewport defines a rectangle in pixels, which the rasterizer uses to find where to display our geometry on the client area of our window. You will also use the viewport when we introduce the depth buffer: we can set the minimum and maximum depth values, usually between 0 and 1, and the OM will then decide which pixel "fragments" to display based on their depth values. We want the viewport to cover our entire window client area, so we set the top left of the rectangle to 0,0 and the bottom right to Width,Height, which are in pixels.

D3D11_VIEWPORT viewport;
ZeroMemory(&viewport, sizeof(D3D11_VIEWPORT));

viewport.TopLeftX = 0;
viewport.TopLeftY = 0;
viewport.Width = Width;
viewport.Height = Height;

##Setting the Viewport## ( ID3D11DeviceContext::RSSetViewports() )

After we have created our viewport, we need to bind it to the RS stage of the pipeline using the function ID3D11DeviceContext::RSSetViewports(). The first parameter is the number of viewports to bind, and the second is a pointer to an array of viewports. This is where you could have multiple "windows" into the scene, like one for player one and one for player two.

d3d11DevCon->RSSetViewports(1, &viewport);
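Here is a sketch of that split-screen idea; nothing in this lesson uses it, it's only an illustration. Two viewports each cover half the client area, and only one is active at a time with this pattern: set the left one, draw player one's view, then set the right one and draw player two's view.

D3D11_VIEWPORT leftView;
ZeroMemory(&leftView, sizeof(D3D11_VIEWPORT));
leftView.TopLeftX = 0;
leftView.TopLeftY = 0;
leftView.Width = Width / 2.0f;
leftView.Height = (FLOAT)Height;

D3D11_VIEWPORT rightView = leftView;   //same size...
rightView.TopLeftX = Width / 2.0f;     //...but starting halfway across the window

d3d11DevCon->RSSetViewports(1, &leftView);
//...draw the scene for player one...
d3d11DevCon->RSSetViewports(1, &rightView);
//...draw the scene for player two...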
##Rendering the Primitive## ( ID3D11DeviceContext::Draw() )

Now we go down to our DrawScene() function, where we change the background color back to black using a float array containing 4 values (RGBA). The new line here is the Draw() call. The first parameter is the number of vertices to draw, and the second is an offset from the beginning of the vertex buffer at which to start drawing.

void DrawScene()
{
    float bgColor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
    d3d11DevCon->ClearRenderTargetView(renderTargetView, bgColor);

    d3d11DevCon->Draw( 3, 0 );

    SwapChain->Present(0, 0);
}

##Effect File (Shaders)## ( Vertex Shader )

Let's move on to the effect file now. We have a very, very simple effect file containing only a vertex and a pixel shader, and each of these functions does the bare minimum. Here we name our vertex shader "VS" and give it a single argument, a 4D float value called inPos. The "POSITION" semantic is where the IA will send the position element of our vertex structure, since we told it to do so in our input layout. You can change "POSITION" to anything else, as long as you remember to change it in your vertex layout too. The only thing we do here is return the position value to the next active stage in the pipeline.

float4 VS(float4 inPos : POSITION) : SV_POSITION
{
    return inPos;
}

##Effect File (Shaders)## ( Pixel Shader )

The last thing we do in this lesson is create a pixel shader function. The only thing this function does is return the color blue for every pixel passed into it from the rasterizer stage. Later, however, we will be returning texture coordinates, normals, and/or color values from the vertex shader and taking them as input in this pixel shader. We can then use all of these values to determine the final color of our pixel. This is also where we will implement per-pixel lighting.

##Exercise:##

1. In this lesson we drew a triangle. Try to draw a square. HINT -->> (Vertex Buffer)
2. Try to draw lines and points. HINT -->> (Primitive Topology)
3. Change the color of the primitive. HINT -->> (Pixel Shader)
4. Display the scene on only 1/4th of the client area. HINT -->> (Viewport)

Here's the full code:

//Include and link appropriate libraries and headers//
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dx11.lib")
#pragma comment(lib, "d3dx10.lib")

#include <windows.h>
#include <d3d11.h>
#include <d3dx11.h>
#include <D3DX10.h>
#include <xnamath.h>

//Global Declarations - Interfaces//
IDXGISwapChain* SwapChain;
ID3D11Device* d3d11Device;
ID3D11DeviceContext* d3d11DevCon;
ID3D11RenderTargetView* renderTargetView;

///////////////**************new**************////////////////////
ID3D11Buffer* triangleVertBuffer;
ID3D11VertexShader* VS;
ID3D11PixelShader* PS;
ID3D10Blob* VS_Buffer;
ID3D10Blob* PS_Buffer;
ID3D11InputLayout* vertLayout;
///////////////**************new**************////////////////////

//Global Declarations - Others//
LPCTSTR WndClassName = L"firstwindow";
HWND hwnd = NULL;
HRESULT hr;

const int Width  = 300;
const int Height = 300;

//Function Prototypes//
bool InitializeDirect3d11App(HINSTANCE hInstance);
void CleanUp();
bool InitScene();
void UpdateScene();
void DrawScene();

bool InitializeWindow(HINSTANCE hInstance, int ShowWnd, int width, int height, bool windowed);
int messageloop();
LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam);

///////////////**************new**************////////////////////
//Vertex Structure and Vertex Layout (Input Layout)//
struct Vertex    //Overloaded Vertex Structure
{
    Vertex(){}
    Vertex(float x, float y, float z)
        : pos(x,y,z){}

    XMFLOAT3 pos;
};

D3D11_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
UINT numElements = ARRAYSIZE(layout);
///////////////**************new**************////////////////////

int WINAPI WinMain(HINSTANCE hInstance,    //Main windows function
    HINSTANCE hPrevInstance,
    LPSTR lpCmdLine,
    int nShowCmd)
{
    if(!InitializeWindow(hInstance, nShowCmd, Width, Height, true))
    {
        MessageBox(0, L"Window Initialization - Failed", L"Error", MB_OK);
        return 0;
    }

    if(!InitializeDirect3d11App(hInstance))    //Initialize Direct3D
    {
        MessageBox(0, L"Direct3D Initialization - Failed", L"Error", MB_OK);
        return 0;
    }

    if(!InitScene())    //Initialize our scene
    {
        MessageBox(0, L"Scene Initialization - Failed", L"Error", MB_OK);
        return 0;
    }

    messageloop();

    CleanUp();

    return 0;
}

bool InitializeWindow(HINSTANCE hInstance, int ShowWnd, int width, int height, bool windowed)
{
    //(This local typedef only documents the members of the window class structure; it is not otherwise used)
    typedef struct _WNDCLASS {
        UINT cbSize;
        UINT style;
        WNDPROC lpfnWndProc;
        int cbClsExtra;
        int cbWndExtra;
        HANDLE hInstance;
        HICON hIcon;
        HCURSOR hCursor;
        HBRUSH hbrBackground;
        LPCTSTR lpszMenuName;
        LPCTSTR lpszClassName;
    } WNDCLASS;

    WNDCLASSEX wc;

    wc.cbSize = sizeof(WNDCLASSEX);
    wc.style = CS_HREDRAW | CS_VREDRAW;
    wc.lpfnWndProc = WndProc;
    wc.cbClsExtra = NULL;
    wc.cbWndExtra = NULL;
    wc.hInstance = hInstance;
    wc.hIcon = LoadIcon(NULL, IDI_APPLICATION);
    wc.hCursor = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 2);
    wc.lpszMenuName = NULL;
    wc.lpszClassName = WndClassName;
    wc.hIconSm = LoadIcon(NULL, IDI_APPLICATION);

    if (!RegisterClassEx(&wc))
    {
        MessageBox(NULL, L"Error registering class", L"Error", MB_OK | MB_ICONERROR);
        return false;
    }

    hwnd = CreateWindowEx(
        NULL,
        WndClassName,
        L"Lesson 4 - Begin Drawing",
        WS_OVERLAPPEDWINDOW,
        CW_USEDEFAULT, CW_USEDEFAULT,
        width, height,
        NULL,
        NULL,
        hInstance,
        NULL
        );

    if (!hwnd)
    {
        MessageBox(NULL, L"Error creating window", L"Error", MB_OK | MB_ICONERROR);
        return false;
    }

    ShowWindow(hwnd, ShowWnd);
    UpdateWindow(hwnd);

    return true;
}

bool InitializeDirect3d11App(HINSTANCE hInstance)
{
    //Describe our Buffer
    DXGI_MODE_DESC bufferDesc;

    ZeroMemory(&bufferDesc, sizeof(DXGI_MODE_DESC));

    bufferDesc.Width = Width;
    bufferDesc.Height = Height;
    bufferDesc.RefreshRate.Numerator = 60;
    bufferDesc.RefreshRate.Denominator = 1;
    bufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    bufferDesc.ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_UNSPECIFIED;
    bufferDesc.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;

    //Describe our SwapChain
    DXGI_SWAP_CHAIN_DESC swapChainDesc;

    ZeroMemory(&swapChainDesc, sizeof(DXGI_SWAP_CHAIN_DESC));

    swapChainDesc.BufferDesc = bufferDesc;
    swapChainDesc.SampleDesc.Count = 1;
    swapChainDesc.SampleDesc.Quality = 0;
    swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    swapChainDesc.BufferCount = 1;
    swapChainDesc.OutputWindow = hwnd;
    swapChainDesc.Windowed = TRUE;
    swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;

    //Create our SwapChain
    hr = D3D11CreateDeviceAndSwapChain(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, NULL, NULL, NULL,
        D3D11_SDK_VERSION, &swapChainDesc, &SwapChain, &d3d11Device, NULL, &d3d11DevCon);

    //Create our BackBuffer
    ID3D11Texture2D* BackBuffer;
    hr = SwapChain->GetBuffer( 0, __uuidof( ID3D11Texture2D ), (void**)&BackBuffer );

    //Create our Render Target
    hr = d3d11Device->CreateRenderTargetView( BackBuffer, NULL, &renderTargetView );
    BackBuffer->Release();

    //Set our Render Target
    d3d11DevCon->OMSetRenderTargets( 1, &renderTargetView, NULL );

    return true;
}

void CleanUp()
{
    //Release the COM Objects we created
    SwapChain->Release();
    d3d11Device->Release();
    d3d11DevCon->Release();
    renderTargetView->Release();
    ///////////////**************new**************////////////////////
    triangleVertBuffer->Release();
    VS->Release();
    PS->Release();
    VS_Buffer->Release();
    PS_Buffer->Release();
    vertLayout->Release();
    ///////////////**************new**************////////////////////
}
///////////////**************new**************////////////////////
bool InitScene()
{
    //Compile Shaders from shader file
    hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "VS", "vs_4_0", 0, 0, 0, &VS_Buffer, 0, 0);
    hr = D3DX11CompileFromFile(L"Effects.fx", 0, 0, "PS", "ps_4_0", 0, 0, 0, &PS_Buffer, 0, 0);

    //Create the Shader Objects
    hr = d3d11Device->CreateVertexShader(VS_Buffer->GetBufferPointer(), VS_Buffer->GetBufferSize(), NULL, &VS);
    hr = d3d11Device->CreatePixelShader(PS_Buffer->GetBufferPointer(), PS_Buffer->GetBufferSize(), NULL, &PS);

    //Set Vertex and Pixel Shaders
    d3d11DevCon->VSSetShader(VS, 0, 0);
    d3d11DevCon->PSSetShader(PS, 0, 0);

    //Create the vertex buffer
    Vertex v[] =
    {
        Vertex( 0.0f, 0.5f, 0.5f ),
        Vertex( 0.5f, -0.5f, 0.5f ),
        Vertex( -0.5f, -0.5f, 0.5f ),
    };

    D3D11_BUFFER_DESC vertexBufferDesc;
    ZeroMemory( &vertexBufferDesc, sizeof(vertexBufferDesc) );

    vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
    vertexBufferDesc.ByteWidth = sizeof( Vertex ) * 3;
    vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    vertexBufferDesc.CPUAccessFlags = 0;
    vertexBufferDesc.MiscFlags = 0;

    D3D11_SUBRESOURCE_DATA vertexBufferData;
    ZeroMemory( &vertexBufferData, sizeof(vertexBufferData) );
    vertexBufferData.pSysMem = v;

    hr = d3d11Device->CreateBuffer( &vertexBufferDesc, &vertexBufferData, &triangleVertBuffer);

    //Set the vertex buffer
    UINT stride = sizeof( Vertex );
    UINT offset = 0;
    d3d11DevCon->IASetVertexBuffers( 0, 1, &triangleVertBuffer, &stride, &offset );

    //Create the Input Layout
    hr = d3d11Device->CreateInputLayout( layout, numElements, VS_Buffer->GetBufferPointer(),
        VS_Buffer->GetBufferSize(), &vertLayout );

    //Set the Input Layout
    d3d11DevCon->IASetInputLayout( vertLayout );

    //Set Primitive Topology
    d3d11DevCon->IASetPrimitiveTopology( D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST );

    //Create the Viewport
    D3D11_VIEWPORT viewport;
    ZeroMemory(&viewport, sizeof(D3D11_VIEWPORT));

    viewport.TopLeftX = 0;
    viewport.TopLeftY = 0;
    viewport.Width = Width;
    viewport.Height = Height;

    //Set the Viewport
    d3d11DevCon->RSSetViewports(1, &viewport);

    return true;
}
///////////////**************new**************////////////////////

void UpdateScene()
{

}

///////////////**************new**************////////////////////
void DrawScene()
{
    //Clear our backbuffer
    float bgColor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
    d3d11DevCon->ClearRenderTargetView(renderTargetView, bgColor);

    //Draw the triangle
    d3d11DevCon->Draw( 3, 0 );

    //Present the backbuffer to the screen
    SwapChain->Present(0, 0);
}
///////////////**************new**************////////////////////

int messageloop(){
    MSG msg;
    ZeroMemory(&msg, sizeof(MSG));
    while(true)
    {
        //(Prototype of the Win32 function used below, shown for reference)
        //BOOL PeekMessage( LPMSG lpMsg, HWND hWnd, UINT wMsgFilterMin, UINT wMsgFilterMax, UINT wRemoveMsg );

        if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                break;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        else{
            // run game code
            UpdateScene();
            DrawScene();
        }
    }
    return msg.wParam;
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch( msg )
    {
    case WM_KEYDOWN:
        if( wParam == VK_ESCAPE ){
            DestroyWindow(hwnd);
        }
        return 0;

    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
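The code above leaves error checking out to keep it condensed. A minimal pattern for adding it, just a sketch you can adapt, is to test each returned HRESULT with FAILED() and stop initialization when something fails:

//A small helper: show a message and report failure if an HRESULT is an error code
bool CheckHR(HRESULT result, LPCWSTR message)
{
    if (FAILED(result))
    {
        MessageBox(0, message, L"Error", MB_OK | MB_ICONERROR);
        return false;
    }
    return true;
}

//Example usage inside InitScene():
//if (!CheckHR(d3d11Device->CreateBuffer(&vertexBufferDesc, &vertexBufferData, &triangleVertBuffer),
//             L"CreateBuffer Failed")) return false;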
Comments
I'm using VS2015 Community Edition and D3DX is no longer available (have not installed the legacy June 2010 DirectX SDK). Some research for replacements of the D3DX library: http://blogs.msdn.com/b/chuckw/archive/2013/08/21/living-without-d3dx.aspx

D3DX11CompileFromFile --> D3DCompileFromFile

ID3DBlob *pErrorBlob = NULL;
UINT flags = D3DCOMPILE_ENABLE_STRICTNESS;
#ifdef _DEBUG
flags |= D3DCOMPILE_DEBUG;
#endif // _DEBUG
const D3D_SHADER_MACRO defines_a[] = { { NULL, NULL } };

// Compile Shaders from shader file
// https://msdn.microsoft.com/de-de/library/windows/desktop/hh968107(v=vs.85).aspx
hr = D3DCompileFromFile(L"VertexShader.hlsl", defines_a, D3D_COMPILE_STANDARD_FILE_INCLUDE, "VS", "vs_4_0", flags, 0, &g_pVertexShaderBuffer, &pErrorBlob);

Note: g_pVertexShaderBuffer is no longer an ID3D10Blob, it changed to ID3DBlob.

Exercise:

1. In this lesson we drew a triangle. Try to draw a square. HINT -->> (Vertex Buffer)

bool InitScene()
{
    // Create the vertex buffer (vertices must be in clock-wise order)
    Vertex vertices_a[] =
    {
        // left triangle
        Vertex(-0.5f, 0.5f, 0.5f),  // upper left corner
        Vertex(0.5f, -0.5f, 0.5f),  // lower right corner
        Vertex(-0.5f, -0.5f, 0.5f), // lower left corner
        // right triangle
        Vertex(-0.5f, 0.5f, 0.5f),  // upper left corner
        Vertex(0.5f, 0.5f, 0.5f),   // upper right corner
        Vertex(0.5f, -0.5f, 0.5f),  // lower right corner
    };

    D3D11_BUFFER_DESC vertexBufferDesc;
    ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc));
    vertexBufferDesc.ByteWidth = sizeof(Vertex) * 6; // 6 vertices for a quad
}

void DrawScene()
{
    // Draw the triangle
    g_pDeviceContext->Draw(3, 0); // draw the left triangle (vertices: 0, 1, 2)
    g_pDeviceContext->Draw(3, 3); // draw the right triangle (vertices: 3, 4, 5)
}

2. Try to draw lines and points. HINT -->> (Primitive Topology)

bool InitScene()
{
    // Set Primitive Topology
    g_pDeviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST); // 3 vertices per triangle
    //g_pDeviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_POINTLIST);  // 1 vertex per point
    //g_pDeviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_LINELIST);   // 2 vertices per line
}

3. Change the color of the primitive. HINT -->> (Pixel Shader)

PixelShader file:

float4 PS() : SV_TARGET
{
    return float4(1.0f, 1.0f, 1.0f, 1.0f); // change one of the first 3 values (0.0f - 1.0f; r/g/b)
}

4. Display the scene on only 1/4th of the client area. HINT -->> (Viewport)

bool InitScene()
{
    // Create the Viewport
    D3D11_VIEWPORT viewport;
    ZeroMemory(&viewport, sizeof(D3D11_VIEWPORT));
    viewport.TopLeftX = 0;
    viewport.TopLeftY = 0;
    viewport.Width = g_width / 4;   // divide by 4 to only use 1/4 of client area (width)
    viewport.Height = g_height / 4; // divide by 4 to only use 1/4 of client area (height)
}
on Feb 15 `16
wlasar64
If you are using VS2015, make sure to change your properties->HLSL Compiler->Shader Type to Effects, and Shader Model to 5.0
on Jul 14 `16
ericxtang
Why, when I change my vertex array and buffer like this:

Vertex v[] =
{
    Vertex(-0.5f, -0.5f, 0.5f),
    Vertex(-0.5f, 0.5f, 0.5f),
    Vertex(0.5f, -0.5f, 0.5f),
    Vertex(0.5f, 0.5f, 0.5f)
};

vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
vertexBufferDesc.ByteWidth = sizeof(Vertex) * 4;
vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
vertexBufferDesc.CPUAccessFlags = 0;
vertexBufferDesc.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA vertexBufferData;
ZeroMemory(&vertexBufferData, sizeof(vertexBufferData));
vertexBufferData.pSysMem = v;
hr = d3d11Device->CreateBuffer(&vertexBufferDesc, &vertexBufferData, &squareVertBuffer);
ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc));

does it not show me the square I wanted?
on Jan 29 `17
benyang
please ask questions like this in the questions section. comments are not really code friendly
on Jan 30 `17
iedoc
Here's some nice information: 1 - the "Effects.fx" file must be shipped alongside the exe (I don't know if it can be embedded as a resource... but maybe); 2 - texture coordinates go from 0.0 to 1.0, but the position (in screen-coordinate terms; I'm speaking of the triangle's position) goes from -1.0 to 1.0, with zero at the center. Both are float values.
on Jan 25 `18
cambalinho