Microsoft XNA Game Studio Creator’s Guide- P12

Microsoft XNA Game Studio Creator's Guide - P12: The release of the XNA platform, and specifically the ability for anyone to write Xbox 360 console games, was truly a major progression in the game-programming world. Before XNA, it was simply too complicated and costly for a student, software hobbyist, or independent game developer to gain access to a decent development kit for a major console platform.

Chủ đề:

Nội dung Text: Microsoft XNA Game Studio Creator’s Guide- P12

Figure 19-3 shows several rockets at various elevations (on the Y axis) and at different stages of flight. Over time, the projectiles lose momentum and gravity pulls them to the ground. The overall effect creates a nice arcing projectile path.

[FIGURE 19-3: Considering the effect of gravity over time]

Game developers will often use real-world physics to create more realistic graphics effects. The physical properties that they consider may include gravity, friction, force, velocity, acceleration, viscosity, and much more. In case you're wondering, game development companies will often implement pseudo-physics in their algorithms. As long as the effect looks correct and is efficient, an approximation of the laws of physics is usually the faster and more effective alternative. After all, as a simulation approaches reality, it can become so complex that it loses its value. However, even when the code deviates from the laws of physics, realistic algorithms usually consider some portion of the real physical model.

Once the launch velocity and direction have been obtained, the effect of gravity can be computed and the X, Y, and Z positions of the projectile can be calculated over time. The X and Z positions are calculated using the same equations as the Linear Projectile algorithm to obtain the projectile's position over time:

    Xt = Xstart + Vx * t
    Zt = Zstart + Vz * t

The Arcing Projectile algorithm treats the calculation of the Y position over time as a special case that also considers gravity. Initially, the projectile's velocity is powerful enough to defy gravity—otherwise, there would be insufficient energy to launch
the projectile into the air. However, over time, the projectile loses its momentum and gravity becomes the strongest force on the object. This gravitational pull is defined by a constant value of acceleration, g, which represents the Earth's gravity. The accepted value for g equals 9.8 meters/second² (32 ft/s²). After the Earth's gravity is factored in, the equation used for calculating the Y position over time becomes:

    Yt = Ystart + Vy * t - 0.5 * g * t²

Implementing these projectile algorithms in code is simple. The first example in this chapter implements the Linear Projectile algorithm. Then, in the example that follows, the Linear Projectile algorithm is converted into an Arcing Projectile algorithm.

LINEAR PROJECTILES EXAMPLE

This example demonstrates how to add projectiles that can be launched on a linear path from a rocket launcher, as shown back in Figure 19-1. In this example, you will shoot ten rockets into the air at a time. When a trigger or spacebar event occurs, the first available rocket (that is not already in flight) is launched.

At the time of launch, the rocket is given a position and direction to start it on an outward journey from the tip of the rocket launcher. The rocket launcher's position and direction are based on the camera's current position and Look direction. Also, during the launch, the activation state for the projectile is set to true, and remains set to true until the projectile reaches the end of the path. The activation state prevents the projectile from being reused while it is in flight. The projectile properties are reset every time the projectile is launched.

This example begins with either the MGHWinBaseCode or MGH360BaseCode project located in the BaseCode folder on this book's website. You will create a Projectile class to assist with the implementation of your projectiles. You will use Projectile to keep track of each rocket and to update its position.
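The book's examples are written in C# for XNA, but the two linear equations above are language-agnostic. As a quick hypothetical sketch (in Python, with invented values), the position of a linear projectile at any time t is just the start position plus a time-scaled velocity:

```python
# Hypothetical sketch of the Linear Projectile equations (Python rather
# than the book's C#): position grows linearly with time along the
# launch direction.
def linear_position(start, velocity, t):
    """Return (x, y, z) of a linear projectile at time t (seconds)."""
    x0, y0, z0 = start
    vx, vy, vz = velocity
    return (x0 + vx * t, y0 + vy * t, z0 + vz * t)

# A rocket launched from the origin along (1, 0.5, 2) units/second:
print(linear_position((0.0, 0.0, 0.0), (1.0, 0.5, 2.0), 3.0))  # (3.0, 1.5, 6.0)
```

Because there is no acceleration term, the path never bends; the Arcing Projectile algorithm later replaces only the Y component.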
The Projectile class can be created from scratch in the Solution Explorer. To generate it, right-click the project and choose Add New Item. Then, choose the Code File icon and enter Projectile.cs as the Name in the Add New Item dialog. When you click Add, GS will generate an empty Projectile.cs file. First, add the following code shell to start your Projectile class:

    using Microsoft.Xna.Framework;

    namespace Projectiles{
        public class Projectile{
        }
    }
Class-level declarations are also required for storing the position, direction, and activation state of each projectile. An additional variable, for storing the size of the world, enables a check to determine whether the projectile has flown out of sight. This tells you when to deactivate the projectile. To allow access to these variables throughout the class, we place their declarations at the top of the Projectile class (inside the class declaration):

    public  Vector3 position, previousPosition; // rocket position
    private Vector3 speed;                      // relative change in X,Y,Z
    public  Matrix  directionMatrix;            // direction transformations
    public  bool    active;                     // visibility
    private float   boundary;                   // edge of world on X and Z
    private float   seconds;                    // seconds since launch
    private Vector3 startPosition;              // launch position

When the program begins, each projectile needs to be created only once. After they are created, the projectiles remain inactive until the user launches them. Later, you will add a method to deactivate a projectile when it flies past the boundaries of the world. To set the projectile flight range and activation state when the projectile is initialized, add this constructor to the Projectile class:

    public Projectile(float border){
        boundary = border;
        active   = false;
    }

The projectile's position, direction, and activation state are set according to the camera's position and Look direction at the time of the launch. The rocket speed is actually based on the direction, which is a relative change in X, Y, and Z. Including the Launch() method in the Projectile class will enable proper initialization of these attributes during the launch.
    public void Launch(Vector3 look, Vector3 start){
        position = startPosition = start;    // start at camera
        speed    = Vector3.Normalize(look);  // unitize direction
        active   = true;                     // make visible
        seconds  = 0.0f;                     // used with gravity only
    }

As discussed in Chapter 8, an object's direction can be calculated from the object's speed vector. Adding SetDirectionMatrix() to your Projectile class will
provide the method you need to make your rocket point in the direction it is traveling. This routine applies to both the Linear Projectile algorithm and the Arcing Projectile algorithm. For the Linear Projectile algorithm, the rocket direction remains constant as the rocket travels outwards. For the Arcing Projectile algorithm, SetDirectionMatrix() will launch the rocket with the original launcher direction, and then it will gradually drop the rocket, nose downward, as the gravitational pull takes over:

    private void SetDirectionMatrix(){
        Vector3 Look = position - previousPosition;
        Look.Normalize();

        Vector3 Up    = new Vector3(0.0f, 1.0f, 0.0f); // fake Up to get Right
        Vector3 Right = Vector3.Cross(Up, Look);
        Right.Normalize();
        Up = Vector3.Cross(Look, Right);               // calculate Up with
        Up.Normalize();                                // correct vectors

        Matrix matrix   = new Matrix();                // compute direction matrix
        matrix.Right    = Right;
        matrix.Up       = Up;
        matrix.Forward  = Look;
        matrix.M44      = 1.0f; // W is set to 1 to enable transforms
        directionMatrix = matrix;
    }

The projectile's position is updated before being drawn each frame. Also, in every frame, the projectile's position is incremented by a time-scaled direction vector, which ensures that the rocket flies in the path set by the camera when the rocket is launched. When the projectile location exceeds one of the outer boundaries, it is deactivated and made available for the next launch. The UpdateProjectile() method implements this routine. Adding UpdateProjectile() to the projectile class ensures that your projectile positions are updated while they are active. The method also deactivates the projectiles after they reach the outer limits of your world.
    public void UpdateProjectile(GameTime gameTime){
        previousPosition = position; // archive last position
        position += speed            // update current position
                 * (float)gameTime.ElapsedGameTime.Milliseconds/90.0f;
        SetDirectionMatrix();
        // deactivate if outer border exceeded on X or Z
        if (position.Z >  2.0f * boundary || position.X >  2.0f * boundary
        ||  position.Z < -2.0f * boundary || position.X < -2.0f * boundary)
            active = false;
    }
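To see why SetDirectionMatrix() works, the cross-product construction from Chapter 8 can be reproduced outside XNA. The following is an illustrative Python sketch (not the book's C#) of the same basis calculation: a fake world Up seeds the Right vector, and a second cross product corrects Up so all three axes are perpendicular. It assumes the projectile is not moving straight up or down, where Look would be parallel to the fake Up and the first cross product would be zero:

```python
# Illustrative sketch of the basis construction inside SetDirectionMatrix():
# derive Right and a corrected Up from the movement (Look) vector.
def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def direction_basis(position, previous_position):
    look = normalize(tuple(p - q for p, q in zip(position, previous_position)))
    up = (0.0, 1.0, 0.0)                # fake Up to seed the calculation
    right = normalize(cross(up, look))
    up = normalize(cross(look, right))  # corrected Up, perpendicular to both
    return right, up, look              # rows of the direction matrix

right, up, look = direction_basis((0.0, 0.0, 5.0), (0.0, 0.0, 0.0))
print(look)   # (0.0, 0.0, 1.0) -- traveling straight down +Z
print(right)  # (1.0, 0.0, 0.0)
```

The three returned vectors correspond to the Right, Up, and Forward rows that the XNA code assigns into its Matrix.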
Once the rocket.fbx and launcher.fbx models are referenced from the Solution Explorer, InitializeModels() can initialize the model objects and their transformation matrices:

    void InitializeModels(){
        rocketModel  = Content.Load<Model>("Models\\rocket");
        rocketMatrix = new Matrix[rocketModel.Bones.Count];
        rocketModel.CopyAbsoluteBoneTransformsTo(rocketMatrix);

        launcherModel  = Content.Load<Model>("Models\\launcher");
        launcherMatrix = new Matrix[launcherModel.Bones.Count];
        launcherModel.CopyAbsoluteBoneTransformsTo(launcherMatrix);
    }

To initialize the rocket, when the program begins, call InitializeModels() from LoadContent():

    InitializeModels();

Remember to only reference the models in your project. Both the models and textures should be placed in the Models folder. However, the content pipeline is unable to load more than one file with the same name from the same folder because no extension is required.

Now that the models have been loaded, projectile objects can be created to track each rocket's direction and whereabouts. Declaring ten projectile objects in the game class in the module declarations area will make them available for your use throughout the game class.

Next, we'll draw the rocket launcher. The rocket launcher travels with the camera and rotates about the X axis—with changes to the view position on Y whenever the user moves the mouse or right thumbstick up or down. The model rocket launcher was designed to simplify the transformations for this movement. The rocket launcher's base is positioned at the origin, and the barrel is centered around the Z axis and is positioned further out on Z. By design of the camera, the launcher's rotation range about the X axis is half a circle (or π radians). If the rocket launcher is pointed directly upward, the view position on Y would equal 0.5, and if the rocket launcher is pointed directly downward, the view position would be -0.5 (see Figure 19-4).
As discussed in Chapter 7, because XNA uses the Right Hand Rule, a negative rotation around the X axis will point the launcher upward. Using the same logic, a positive rotation about the X axis will point the launcher downward. The launcher must be rotated about the X axis to match the camera's Look direction about the Y axis.
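The chapter computes this pitch with Matrix.CreateRotationX(-MathHelper.Pi * look.Y). Since the view position on Y runs from -0.5 (straight down) to 0.5 (straight up), the -π · look.Y mapping covers exactly the half-circle range. A small hypothetical Python check of that mapping:

```python
import math

# Hypothetical check of the launcher pitch mapping used by
# Matrix.CreateRotationX(-MathHelper.Pi * look.Y): view Y in [-0.5, 0.5]
# covers half a circle, and a negative X rotation points the launcher
# upward under the Right Hand Rule.
def launcher_pitch(look_y):
    return -math.pi * look_y

print(launcher_pitch(0.5) == -math.pi / 2)   # True: straight up
print(launcher_pitch(-0.5) == math.pi / 2)   # True: straight down
```

The clamp added later (LOWER_LIMIT and UPPER_LIMIT) simply restricts look_y to [-0.1, 0.3] so the launcher base stays out of view.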
[FIGURE 19-4: Rocket launcher rotation range around the X axis]

With this information, you can calculate the rocket launcher's rotation angle about the X axis with the following equation:

    rotationX = Matrix.CreateRotationX(-MathHelper.Pi * look.Y);

The launcher must also be rotated about the Y axis to match the camera's Look direction about the Y axis. And finally, to finish the transformation using the I.S.R.O.T. sequence, the launcher must be translated by an amount that is equivalent to the distance from the origin to the camera. An extra shift downward on the Y axis is added to this translation to move the launcher downward slightly so it does not block your view. Add DrawLauncher() to your game class to move and rotate the rocket launcher with your camera:

    private void DrawLauncher(Model model){
        // 1: declare matrices
        Matrix world, translation, scale, rotationX, rotationY;

        // 2: initialize matrices
        scale       = Matrix.CreateScale(0.002f, 0.002f, 0.002f);
        translation = Matrix.CreateTranslation(cam.position.X, BASE_HEIGHT,
                                               cam.position.Z);
        Vector3 look = cam.view - cam.position;
        rotationX = Matrix.CreateRotationX(-MathHelper.Pi * look.Y);
        rotationY = Matrix.CreateRotationY((float)Math.Atan2(look.X, look.Z));

        // 3: build cumulative matrix using I.S.R.O.T. sequence
        // identity, scale, rotate, orbit (translate & rotate), translate
        world = scale * rotationX * rotationY * translation;

        // 4: set shader parameters
        foreach (ModelMesh mesh in model.Meshes){
            foreach (BasicEffect effect in mesh.Effects){
                effect.World      = launcherMatrix[mesh.ParentBone.Index] * world;
                effect.View       = cam.viewMatrix;
                effect.Projection = cam.projectionMatrix;
                effect.EnableDefaultLighting();
                effect.SpecularPower = 0.01f;
            }
            // 5: draw object
            mesh.Draw();
        }
    }

To actually see the rocket launcher, you obviously need to call the method to draw it. Adding DrawLauncher() to the end of the Draw() method will draw the rocket launcher when other objects are rendered:

    DrawLauncher(launcherModel);

Because the rocket launcher's rotation angle about the X axis changes with the view position on Y, if the right thumbstick or mouse shifts the view all the way up or all the way down, you can actually see the base of the launcher, which spoils the effect. Inside the camera class in the UpdateView() method, you'll replace the code that caps the Y view position so that it can no longer exceed 0.30 or fall below -0.10, which prevents you from pointing the launcher into the ground. The end result is that whatever angle you point, it looks as though you are always holding the rocket launcher:

    const float LOWER_LIMIT = -0.1f;
    const float UPPER_LIMIT =  0.3f;
    if (look.Y > LOWER_LIMIT && look.Y < UPPER_LIMIT)

The code that you use to launch the rocket (from the game class) is contained in the LaunchRocket() method. This routine searches through the array of projectiles
and finds the first inactive projectile available. When an inactive projectile is found, LaunchRocket() sets the starting position and direction to equal the camera position and Look direction.

The transformations use the I.S.R.O.T. sequence. Their implementation to angle and position the rocket at the tip of the launcher is summarized in the comments included with this code.

The starting position is needed to help track the location of each rocket. To create the required transformation, and record the initial starting position of the rocket, we can use the matrix math discussed in Chapter 8 and Chapter 16. Once the starting position is computed using matrices, the first row of the matrix that contains the position information is stored in a vector. This position vector can be used later to update the position of the rocket by incrementing the position by a time-scaled direction vector. As you can see, it really does pay to understand how to employ linear algebra beyond just using the Matrix objects and methods that are shipped with XNA.
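The "origin in the first row" trick can be reproduced with plain arithmetic. In this hypothetical Python sketch (XNA's Matrix is row-major, with translation in the fourth row; the helper names are invented), a point with W = 1 picks up the translation when multiplied, and the transformed position is read back from the first row, just as the XNA code reads M11, M12, and M13:

```python
# Language-agnostic sketch of the trick described above: store a point as a
# row vector with W = 1, multiply it by a row-major translation matrix, and
# read the transformed position back out of the result.
def identity():
    return [[float(i == j) for j in range(4)] for i in range(4)]

def translation(tx, ty, tz):
    m = identity()
    m[3][0], m[3][1], m[3][2] = tx, ty, tz  # XNA-style row-major translation
    return m

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Origin stored in the first row, W (M14) set to 1 so translation applies.
position = [[0.0, 0.0, 0.0, 1.0]] + [[0.0] * 4 for _ in range(3)]
position = matmul(position, translation(0.0, 0.0, -0.85))  # tip of launcher
start = tuple(position[0][:3])  # equivalent of (M11, M12, M13)
print(start)  # (0.0, 0.0, -0.85)
```

If W were left at 0, the point would behave like a direction and ignore the translation entirely, which is why LaunchRocket() sets M14 to 1.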
Add LaunchRocket() to your game class to find the first available rocket when a launch is triggered and to calculate and store the starting position and direction of the rocket:

    private void LaunchRocket(int i){
        Matrix  orbitTranslate, orbitX, orbitY, translate, position;
        Vector3 look, start;

        // create matrix and store origin in first row
        position     = new Matrix(); // zero matrix
        position.M14 = 1.0f;         // set W to 1 so you can transform it

        // move to tip of launcher
        orbitTranslate = Matrix.CreateTranslation(0.0f, 0.0f, -0.85f);

        // use same direction as launcher
        look = cam.view - cam.position;

        // offset needed to rotate rocket about X to see it with camera
        float offsetAngle = MathHelper.Pi;

        // adjust angle about X with changes in Look (Forward) direction
        orbitX = Matrix.CreateRotationX(offsetAngle - MathHelper.Pi*look.Y);

        // rocket's Y direction is same as camera's at time of launch
        orbitY = Matrix.CreateRotationY((float)Math.Atan2(look.X, look.Z));
        // move rocket to camera position where launcher base is also located
        translate = Matrix.CreateTranslation(cam.position.X, BASE_HEIGHT,
                                             cam.position.Z);

        // use the I.S.R.O.T. sequence to get rocket start position
        position = position * orbitTranslate * orbitX * orbitY * translate;

        // convert from matrix back to vector so it can be used for updates
        start = new Vector3(position.M11, position.M12, position.M13);

        rocket[i].Launch(look, start);
    }

At this point, the projectile objects are initialized and your launcher is in place. Your rockets are ready, but a mechanism is required to trigger their launch. In this case, you will add code to initiate their launch when the left mouse button is clicked, or when the right trigger on the controller is pressed. To ensure that all ten rockets are not launched during this press event—which lasts over several frames—the current and previous states of the game pad and mouse are compared. To enable this input device state checking, you must add a declaration for game pad and mouse states at the class level:

    #if !XBOX
        MouseState mouseCurrent, mousePrevious;
    #endif
        GamePadState gamepad, gamepadPrevious;

The projectile trigger events can now be handled at the end of the Update() method. In this block of code, you will update the mouse and game-pad states. Then you can determine new mouse button click or trigger-pull events by comparing the button press states in the current frame with release states from the previous frame:

    // refresh key and button states
    #if !XBOX
        mouseCurrent = Mouse.GetState();
    #endif
        gamepad = GamePad.GetState(PlayerIndex.One);

    // launch rocket for right trigger and left click events
    if (gamepad.Triggers.Right > 0 && gamepadPrevious.Triggers.Right == 0
    #if !XBOX
        || mouseCurrent.LeftButton  == ButtonState.Pressed
        && mousePrevious.LeftButton == ButtonState.Released
    #endif
    ){
        // if launch event then launch next available rocket
        for (int i = 0; i < NUM_ROCKETS; i++)
            if (rocket[i].active == false){
                LaunchRocket(i);
                break;
            }
    }

    // archive current state for comparison next frame
    gamepadPrevious = gamepad;
    #if !XBOX
        mousePrevious = mouseCurrent;
    #endif

In each frame, the locations of all projectiles must be updated so each can be animated properly along its trajectory path. The code used to trigger the update for each projectile position belongs at the end of the Update() method in the game class:

    // update rockets that are in flight
    for (int i = 0; i < NUM_ROCKETS; i++)
        if (rocket[i].active)
            rocket[i].UpdateProjectile(gameTime);

Only one method is used to draw each projectile. The details are explained in the comments:

    private void DrawRockets(Model model, int i){
        // 1: declare matrices
        Matrix world, scale, rotateX, translate;

        // 2: initialize matrices
        scale     = Matrix.CreateScale(0.0033f, 0.0033f, 0.0033f);
        rotateX   = Matrix.CreateRotationX(-MathHelper.Pi/2.0f);
        translate = Matrix.CreateTranslation(rocket[i].position);

        // 3: build cumulative matrix using I.S.R.O.T. sequence
        world = scale * rotateX * rocket[i].directionMatrix * translate;

        // 4: set shader parameters
        foreach (ModelMesh mesh in model.Meshes){
            foreach (BasicEffect effect in mesh.Effects){
                effect.World      = rocketMatrix[mesh.ParentBone.Index] * world;
                effect.View       = cam.viewMatrix;
                effect.Projection = cam.projectionMatrix;
                effect.EnableDefaultLighting();
                effect.SpecularPower = 16.5f;
            }
            // 5: draw object
            mesh.Draw();
        }
    }

To ensure that projectiles are actually drawn, DrawRockets() needs to be called from the Draw() method. This code loops through all projectile objects and draws the active ones at their current position with their corresponding direction:

    for (int i = 0; i < NUM_ROCKETS; i++)
        if (rocket[i].active)
            DrawRockets(rocketModel, i);

When you compile and run this program, it shows the Linear Projectile algorithm in action. Whenever the left mouse button is clicked, or a game controller trigger is pulled, a rocket is launched. Each projectile shoots outward until it reaches an arbitrary boundary located at the outer limits of the world.

ARCING PROJECTILES EXAMPLE

This Arcing Projectiles example picks up where the Linear Projectile algorithm ends. When this example is complete, and the effect of gravity is factored in, the flight of each projectile will rise to a peak and then follow a descending path to the ground.

Most of the code in this revised routine remains the same. However, the method that updates the rocket position will be replaced so that the gravitational pull over time is taken into consideration. In this new routine, initially the Linear Projectile algorithm is implemented long enough for the projectile to safely leave the barrel of the launcher in case it is moving. Use of constant speed at the onset creates a mini turbo boost so your players don't risk blowing themselves up every time they pull the trigger. Once the rocket is far enough away from the launcher, gravity kicks in—then the count to the touchdown begins.

The UpdateProjectile() method updates the position by factoring speed over time. In Y's case, the height is also adjusted with the pull of gravity over time.
SetDirectionMatrix() makes this effect even more realistic by adjusting the rocket's direction. The transformation matrix set when calling this method ensures that the rocket flies in the proper direction about the Y axis, and it also gives the rocket a tilt on the X axis, so it points upward as it climbs and then lowers as the rocket descends to the ground. Replace the existing UpdateProjectile() method with this one to implement the change:

    public void UpdateProjectile(GameTime gameTime){
        previousPosition = position; // store position from last frame

        // gravity takes over after rocket clears the launcher
        Vector3 distanceTravelled = position - startPosition;
        if (distanceTravelled.Length() > 1.8f)
        {
            // gravity time starts ticking
            seconds += gameTime.ElapsedGameTime.Milliseconds/1000.0f;

            const float GRAVITY = 9.8f;   // gravity constant and scaling
            const float SCALAR  = 0.0784f;

            position.X += speed.X*seconds; // X uses speed vs time
            position.Z += speed.Z*seconds; // Z uses speed vs time
            position.Y += speed.Y*seconds  // Y uses speed and gravity vs time
                        - 0.5f*GRAVITY*seconds*seconds * SCALAR;
        }
        // turbo boost needed for rocket to clearly leave the launcher
        else
            position += speed // speed uses constant scalar
                     * (float)gameTime.ElapsedGameTime.Milliseconds/90.0f;

        SetDirectionMatrix();   // rocket direction considers speed

        if (position.Y < -0.5f) // deactivate if below ground
            active = false;
    }

Running the program now shows the projectiles rising in an arc. When the peak is reached, they rotate gradually so they point downward and fall back to the ground. On reaching the ground, they are deactivated and made ready for the next launch. Whether you are allowing your players to throw a ball or deploy weaponry, your ballistics are ready for launch.
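The arcing math can also be checked in isolation. This hypothetical Python sketch evaluates the closed-form equations from the start of the chapter (X and Z stay linear while Y loses 0.5 · g · t² to gravity), ignoring the in-game SCALAR tuning and the turbo-boost phase:

```python
# Sketch of the Arcing Projectile equations (Python, not the book's C#):
# X and Z are linear in time; Y subtracts 0.5 * g * t^2 for gravity.
G = 9.8  # m/s^2, the accepted value for Earth's gravity

def arcing_position(start, velocity, t):
    x0, y0, z0 = start
    vx, vy, vz = velocity
    return (x0 + vx * t,
            y0 + vy * t - 0.5 * G * t * t,
            z0 + vz * t)

# A projectile launched straight up at 9.8 m/s peaks after one second...
print(arcing_position((0.0, 0.0, 0.0), (0.0, 9.8, 0.0), 1.0))  # (0.0, 4.9, 0.0)
# ...and returns to launch height after two:
print(arcing_position((0.0, 0.0, 0.0), (0.0, 9.8, 0.0), 2.0))  # (0.0, 0.0, 0.0)
```

In the XNA routine above, the same terms appear per frame, with SCALAR = 0.0784 compressing the gravity constant down to the scale of the game world.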
CHAPTER 19 REVIEW EXERCISES

To get the most from this chapter, try out these chapter review exercises.

1. Follow the step-by-step examples shown in this chapter to implement the Linear Projectile algorithm and Arcing Projectile algorithm, if you have not already done so.

2. State how the projectile update routine for linear projectiles differs from that for arcing projectiles.

3. Replace the model rocket with your own 3D object and make it point in the direction that it travels. Add bounding-sphere collision detection to an object in your world so that something happens when you hit it.
CHAPTER 20

Particle Effects
PARTICLE algorithms enable effects such as rain, explosions, fire, smoke, sparkles, and much more. You could say that the effects created by the particle algorithm are only limited by your imagination. Compare the non-particle-based explosions in Space Invaders with an explosion that uses a particle algorithm—like a rocket explosion in id Software's Quake. Quake's rocket effect is substantially more interesting.

A particle is a user-defined object that sets, stores, and updates properties for a group of related items. Each group or class of particles shares a similar but slightly randomized set of properties (for example, a group of rain particles, snow particles, fire particles, or smoke particles). Particles are usually assigned properties for life, size, color, position, and speed. As an example, a snowflake would have a starting position somewhere up in the sky, so the X and Z positions would be random but the starting position for Y would definitely be positive. The snow particle's life starts at the beginning of the particle's descent and ends when the snowflake reaches the ground. The snowflakes are small, but each one varies slightly in size. The snowflake's color property would be set to a shade of white. The snowflake's speed would definitely be negative on the Y axis, but the X and Z speeds are random. The Y speed of the snowflake particle is varied and slow enough to allow for the snow to drift to the ground.

There is no set syntax or rule for defining particles—they have different properties based on their implementation. Particles are usually regenerated on a continuous basis, but some randomization is normally present for creating a dynamic and ever-changing special effect.

When drawing particles, you often need transparency to remove background pixels. This generates the image you need for your effect—such as rain, fire, or an explosion.
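The snowflake description above can be sketched as a small particle class. This is a hypothetical Python illustration (the class name, ranges, and constants are all invented, not from the book): each property is slightly randomized, and the particle regenerates when its life ends at the ground:

```python
import random

# Hypothetical sketch of the snow particle described above: randomized
# size, color, position, and speed, with continuous regeneration.
class SnowParticle:
    def __init__(self, world_size):
        self.world_size = world_size
        self.regenerate()

    def regenerate(self):
        self.x = random.uniform(-self.world_size, self.world_size)
        self.z = random.uniform(-self.world_size, self.world_size)
        self.y = random.uniform(15.0, 20.0)            # start up in the sky
        self.size = random.uniform(0.02, 0.05)         # small, slightly varied
        self.color = (1.0, 1.0, random.uniform(0.92, 1.0))  # shade of white
        self.speed = (random.uniform(-0.3, 0.3),       # random drift on X
                      random.uniform(-1.5, -0.5),      # always falling on Y
                      random.uniform(-0.3, 0.3))       # random drift on Z

    def update(self, dt):
        self.x += self.speed[0] * dt
        self.y += self.speed[1] * dt
        self.z += self.speed[2] * dt
        if self.y <= 0.0:       # life ends at the ground...
            self.regenerate()   # ...so reset the particle to the sky

flake = SnowParticle(world_size=10.0)
for _ in range(600):            # simulate 10 seconds at 60 fps
    flake.update(1.0 / 60.0)
print(0.0 < flake.y <= 20.0)    # True: always between ground and sky
```

A real effect would keep hundreds of such particles in an array and draw each one every frame; only the bookkeeping is shown here.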
You could use billboarded triangle strips for this task, or you might consider using point sprites.

Because particle algorithms can be expensive in terms of system bandwidth, you should be careful not to create too many particles. Using point sprites will certainly help reduce the drag on performance to allow you to use hundreds and possibly thousands of particles in your game. Game developers use particle algorithms when they want to show off brilliant special effects, but when performance is an issue, they may choose a textured sprite instead of a particle algorithm.

POINT SPRITES

To improve performance, point sprites are often used for particle algorithms. A point sprite is a resizable textured vertex that always faces the camera. Here are three noteworthy characteristics of point sprites that make them suitable for rendering particles:
- A point sprite only uses one vertex, so it saves space and boosts performance.
- When point sprites are enabled, textures are automatically mapped to them, so there is no need to store, set, or map UV coordinates in your XNA code.
- Point sprites always face the camera, so there is no need to implement code to adjust their angle to view them from various directions.

Point sprites can only be enabled through a shader. Much of the shader code is similar to code you have used in this book to draw a textured primitive surface. As discussed in Chapter 6, your shader usually begins with global variables that can be set from your XNA application. Here are declarations that you will need in your point sprite shader to set the texture and sizing from your XNA code:

    texture  textureImage;               // stores texture
    float    fade;                       // fade as near end of life
    float4x4 projection : PROJECTION;    // viewport perspective matrix
    float4x4 wvpMatrix  : WORLDVIEWPROJ; // world view projection matrix
    float    viewportHeight;             // current viewport height

The same texture sampler used throughout this book, and explained in Chapter 9, will also work for this point sprite shader:

    // filter (like a brush) for showing texture
    sampler textureSampler = sampler_state{
        Texture   = <textureImage>;
        magfilter = LINEAR; // magfilter when bigger than actual size
        minfilter = LINEAR; // minfilter when smaller than actual size
        mipfilter = LINEAR; // to resize images close and far away
    };

Until now, your XNA code has been using XNA's preset VertexDeclaration to define the type of vertex data for drawing primitive-based surfaces using data such as texture coordinates, position, color, or normal information. These preset definitions are convenient, but sometimes you will want to customize your vertex definitions. To be able to access features such as setting the point sprite size, you need to create your own custom vertex definition in your XNA code.
This ensures that the data sent from your XNA project is compatible with your vertex shader inputs. This will allow you to size your point sprite from your XNA code.

The vertex shader input for a point sprite still receives color and position information from your XNA application—as has been done in previous shader examples. However, the texture coordinate mapping is automatic, so you don't need to set UV
coordinates in your XNA code or send them to the vertex shader. For you to pass the point sprite size to your vertex shader from your XNA code, the size variable defined for your vertex shader input must be tagged with the PSIZE semantic:

    struct VSinput{
        float4 position : POSITION0;
        float4 color    : COLOR0;
        float1 size     : PSIZE0;
    };

The output from the vertex shader is also different from the shader code that is used for texturing objects. When we're texturing primitive surfaces, the output from our vertex shader includes elements for color, position, and texture data. When implementing point sprites, the vertex shader also outputs the size element, and this must be denoted by the PSIZE semantic. You actually have to invent some data for the texture coordinate, which might seem weird, but some graphics cards require UV coordinates to exist when they leave the vertex shader.

    struct VSoutput{
        float4 position : POSITION0;
        float1 size     : PSIZE;
        float4 color    : COLOR0;
        float4 UV       : TEXCOORD0;
    };

Because of differences in point sprite handling on the Xbox 360 and the PC, we need to create a separate set of output specifically for the pixel shader. This allows you to use your shader on either platform. The Xbox 360 requires that UV coordinates be handled with a four-float vector denoted by a SPRITETEXCOORD semantic, and Windows requires the UV coordinates to be handled with a two-float vector denoted by a TEXCOORD0 semantic. You may think having the extra output from the vertex shader is odd—and it is odd. However, being able to channel the VSoutput data to the graphics pipeline and the PSinput data to the pixel shader is necessary to run the same shader code on both your PC and Xbox 360.

    struct PSinput{
    #ifdef XBOX
        float4 UV : SPRITETEXCOORD;
    #else
        float2 UV : TEXCOORD0;
    #endif
        float4 Color : COLOR0;
    };
The vertex shader is similar to ones you've used before, but some extra values are added here to set up the point sprite for creating 3D fire in the next demonstration. The scale value, fade, reduces the size of each fire particle as it rises in the air and diminishes before being regenerated. fade is also used to darken the color of the particle as it rises away from the core of the fire.

Handling the texture data output for a point sprite from the vertex shader is notably different from a shader that just applies the usual texturing. The point sprite texture is automatically applied to the point sprite when it is sent to the pixel shader, so it doesn't matter what UV coordinates you set in the vertex shader. The data structure for the texture coordinate output from the vertex shader just needs to be in place. A vector with four floats is assigned to the texture output variable. This works both on the PC and on the Xbox. In the case of the PC, the vector will be truncated to a two-float vector.

The pixel shader actually performs the final processing on the color and texture data. However, to appease graphics card differences, and to allow this code to run on the PC and Xbox, two separate output streams have been created. Depending on your graphics card, if you leave this code out, your point sprites may not appear on your PC. The VSoutput data is sent into the graphics pipeline, and the PSinput data is sent to the pixel shader for further processing of color and UV data.

To ensure that the point sprite is sized properly, the following equation is used:

    OUT.size = (IN.size * (projection._m11/OUT.position.w)
                        * (viewportHeight/2)) * fade;

As you can see, this size routine veers away from the standard sizing methods that you might expect.
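The sizing equation can be exercised with plain arithmetic. In this hypothetical Python sketch, the camera values are invented for illustration (a projection._m11 of 2.0 and a 512-pixel viewport), but it shows the behavior the equation encodes: size falls off with the projected distance w, and scales with viewport height and fade:

```python
# Hypothetical sketch of the point sprite sizing equation above; the
# camera values used below are invented, not from the book.
def point_sprite_size(in_size, projection_m11, w, viewport_height, fade):
    return in_size * (projection_m11 / w) * (viewport_height / 2.0) * fade

near = point_sprite_size(1.0, 2.0, 4.0, 512.0, fade=1.0)   # close to camera
far  = point_sprite_size(1.0, 2.0, 16.0, 512.0, fade=1.0)  # 4x farther away
print(near, far)  # 128.0 32.0 -- size is inversely proportional to w
```

Halving fade, or rendering into a viewport half as tall (as in split-screen play), halves the on-screen sprite size in the same way.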
The point sprite size is processed on the graphics card, and it relies on the distance from the camera (projectedPosition.w), scaling for changes to the camera's field of view (projection._m11), and the viewport height. By viewport we mean the section of window that is set for each player in a full-screen or split-screen environment. Also note that the vertex shader's output position value is used after the world-view-projection matrix has been applied to it.

    void VertexShader(in VSinput IN, out VSoutput OUT){
        OUT.position = mul(IN.position, wvpMatrix);
        // projection._m11     - Scaling information from camera's projection
        //                       to adjust to changing fields of view.
        // projectedPosition.w - Distance from the camera.
        // viewportHeight      - Height of the current projection in the
        //                       window. See Chapter 28, "Multiplayer
        //                       Gaming," for more detail.