[JSR-184][3D编程指南]Part III: Particle systems and immediate mode rendering (2)


<!-- Collected from the web; archived here for future reference -->

 

<!-- continued -->

The IndexBuffer
The M3G system stores face (triangle) information in a class called IndexBuffer. We talked about such a class in the second part of the tutorial, so you should know what it does, but I will refresh your memory anyway. The IndexBuffer holds indices into the array of vertices and describes the faces (triangles) that compose our model. Since our model is a simple plane (a quad), it consists of two triangles. Since IndexBuffer is an abstract class, we need to use a class that extends it. This class is the TriangleStripArray. As you can tell by its name, it stores triangle strips in an array, nifty eh? Now, there are two ways to describe triangles in a TriangleStripArray: explicit and implicit. The explicit description is the most common one, since you get to define the vertex indices that the triangles consist of. However, the implicit form can also be very useful. The difference is that in explicit form you need to supply an array that holds all the triangle indices, while in the implicit form you just supply the first index of the first triangle and the TriangleStripArray calculates the rest, assuming that each subsequent index is one greater than the previous.

In this example we will use the explicit form for simplicity's sake. The TriangleStripArray demands that triangles are defined in a special way: you define three indices for the first triangle, and then just one more for each triangle that follows. This is what's generally called a triangle strip. How does this work, you might wonder? Well, as you know, stacking triangles next to each other in a model (such as our simple plane) means that each triangle shares two points with its neighbor (remember our cube from tutorial 2?). So this means that we can actually define a triangle by using two points from the previous triangle plus one extra point. You probably remember from our plane coordinates that we created our plane counter-clockwise starting from the point (-1, -1, 0). So these are the triangles we want to create:

As you see, we cut the mesh in two from point 3 to point 1, leaving us two triangles. These triangles are the (0, 1, 3) and the (1, 3, 2). Can you already see the pattern? The first triangle actually shares points 1 and 3 with the second triangle, so we can describe them like this: (0, 1, 3, 2) and the TriangleStripArray will understand that we have two triangles that share the middle indices. Let's see how we will do this in code.

// Create indices and face lengths
int indices[] = new int[] {0, 1, 3, 2};
int[] stripLengths = new int[] {4};
       
// Create the model's triangles
triangles = new TriangleStripArray(indices, stripLengths);
So the index array is exactly what we talked about: it holds our triangles. The stripLengths array is something I haven't mentioned yet, but it's fairly simple. It just defines the length (in indices) of each strip. Since we define two triangles, our length in indices will be four. This parameter is useful for packing several strips of different lengths into a single TriangleStripArray, but we won't worry about that today; it will be a topic in later tutorials. Just know that today it's four. Having all this, we can create the TriangleStripArray easily by using the explicit-mode constructor, which takes the index array and the stripLengths array. Very easy!
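To make the strip bookkeeping concrete, here is a plain-Java sketch (these helpers are illustrations of how a strip is interpreted, not part of the M3G API): each new index forms a triangle with the previous two, so a strip of n indices always describes n - 2 triangles.

```java
public class StripDemo {
    /** Unrolls a triangle strip into individual triangles: each new index reuses the previous two. */
    public static int[][] unroll(int[] strip) {
        int[][] tris = new int[strip.length - 2][3];
        for (int i = 0; i < strip.length - 2; i++) {
            tris[i][0] = strip[i];
            tris[i][1] = strip[i + 1];
            tris[i][2] = strip[i + 2];
        }
        return tris;
    }

    /** Total triangles described by an array of strip lengths (n - 2 per strip). */
    public static int triangleCount(int[] stripLengths) {
        int total = 0;
        for (int i = 0; i < stripLengths.length; i++) {
            total += stripLengths[i] - 2;
        }
        return total;
    }

    public static void main(String[] args) {
        int[][] tris = unroll(new int[] {0, 1, 3, 2});
        // Our quad's strip yields the two triangles (0, 1, 3) and (1, 3, 2)
        System.out.println(tris[0][0] + "," + tris[0][1] + "," + tris[0][2]); // 0,1,3
        System.out.println(tris[1][0] + "," + tris[1][1] + "," + tris[1][2]); // 1,3,2
        System.out.println(triangleCount(new int[] {4})); // 2
    }
}
```

Running unroll on our index array {0, 1, 3, 2} recovers exactly the two triangles we cut the plane into.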
All that is left for us to do now, is to create the Appearance class, which basically tells the M3G system how to render the faces contained within the VertexBuffer and IndexBuffer classes.
Appearance
The Appearance class holds a lot of information on how to render the model. Alpha blending, textures, culling, you name it. We will need one of these if we are to create our model. Today we'll only use a few of its features. First, let's start with culling. Culling is a technique used to speed up rendering by telling the system that it shouldn't render certain parts of a model, such as its back or its front. For a particle engine, for instance, we never really want to see the back of a particle, so we can always use back-face culling on our particles. The available culling modes are these:
PolygonMode.CULL_BACK
PolygonMode.CULL_FRONT
PolygonMode.CULL_NONE
CULL_BACK enables back-face culling, meaning that the back of the polygon will never be drawn. CULL_FRONT does the same, but for the front of the polygon. The last one disables culling altogether. The culling settings live in a class called PolygonMode, which is a component of the Appearance class. So, to apply a culling mode held in a variable called cullFlags, we can do the following:
Appearance appearance = new Appearance();
PolygonMode pm = new PolygonMode();
pm.setCulling(cullFlags);
appearance.setPolygonMode(pm);
Wasn't that really, really easy? Yeah, it was. Now we have defined a culling form (usually it's CULL_BACK). Next thing on the agenda is creating a texture and setting it into the Appearance class (remember, Appearance also stores texture information). This is done in a very simple manner by creating an Image2D that holds the image. We've already talked about how to do this earlier in this tutorial, so I won't cover it again. However, the Image2D is stored in a class called Texture2D and that's the class I will be explaining.
Textures
In order to create a texture that a model can use, you need two things: an Image2D containing the picture data and a Texture2D object. The Texture2D class holds all relevant texture information, including the image data and the texture transformation that transforms our cookie cutter, remember? It even holds other useful things such as blending (alpha blending), wrapping (how the texture wraps across the polygon) and filtering (the quality of the texture projection). We will be using all of the above today, but blending will be explained a bit later. Let's take it step by step. First we create an Image2D and the corresponding Texture2D object:
// Open image
Image texImage = Image.createImage(texFilename);
Texture2D theTexture = new Texture2D(new Image2D(Image2D.RGBA, texImage));
That was fairly simple and it really needs no explanation. You create the Image2D and just hand it to the Texture2D's constructor. Now let's talk about blending.
You can blend textures with the color of your polygon (yes, models can have colors) in many different ways. I won't go into detail on all the different blending methods; instead I'll ask you to check out the M3G API documentation, specifically the documentation of the Texture2D class. Just know that there are various ways to blend your texture, according to alpha and color, with the underlying color of the model itself. For no blending at all, we'd do the following:
// Replace the mesh's original colors (no blending)
theTexture.setBlending(Texture2D.FUNC_REPLACE);
The above piece of code actually tells the system to ignore all underlying colors of the model and is very useful for many things, since it makes the transparent parts of the texture image transparent on the model as well! Nice, isn't it? You won't be able to alpha-blend the texture with the model this way though. We will blend textures later on in this tutorial, but for now I want you to know about the above method.
The last two things we will do are the wrapping and filtering. These are fairly simple procedures and I won't go into detail. Again, if you want to see what kinds of filtering and wrapping are available to you, check the M3G API documentation. Here we use no wrapping, meaning that the texture is drawn only once onto the polygon, and we use the simplest filtering, which produces the lowest quality but also the highest speed. Remember, devices that implement M3G are still far from the graphics-card performance of today's consoles and PCs, so we have to trade quality for speed most of the time.
// Set wrapping and filtering
theTexture.setWrapping(Texture2D.WRAP_CLAMP, Texture2D.WRAP_CLAMP);
theTexture.setFiltering(Texture2D.FILTER_BASE_LEVEL, Texture2D.FILTER_NEAREST);
Now all that's left is to actually add the texture to the Appearance. Yes, I said add and not set, because a model can actually have multiple textures! This is a very nice technique for many things like animation, lightmapping, etc. Those are, however, advanced topics and we won't be covering them today; I'll go into detail on those subjects in a later part of the series. To add a texture, you simply use the setTexture method (yes, dumb name) and supply an index and a Texture2D. So to add a single texture, you do this:
// Add texture to the appearance
appearance.setTexture(0, theTexture);
All right, we're all done now! We have created our appearance and all other things vital to a Mesh. All that's left now is to actually compose the Mesh from the parts we've just created.
The Mesh
Creating a Mesh is very simple. All we need to do is supply the parts we just created and compose it. This is done with a single line of code, like this:
// Finally create the Mesh
Mesh mesh = new Mesh(vertexBuffer, triangles, appearance);
That's it! Now we have a textured plane that we can render into our scene.
At first, the above method of creating a Mesh might strike you as very complicated, but it actually isn't. Use it a few times and you'll realize that it's very easy and fast, but also very intuitive. Now we can render our Mesh using immediate mode rendering, which I've already talked about, and supply a Transform matrix to place it somewhere in 3D space.
Before we start coding away on a ParticleEffect class that will do something cool, let's first talk a bit about alpha blending of textures and polygons.
Alpha Blending
Often you want to blend a model and its texture to its surroundings, creating semi-transparent models. This can be used for many different things, such as windows, water surface, particle engines, etc. The list is actually endless and only your imagination sets the limits. In M3G this process is very simple and today I will show you how to do it in our particle engine. We want our particles to fade away and become more and more transparent as they progress in their life and completely disappear when they die, so we will need alpha blending for this. This is done in two steps.
The first step of setting up alpha blending for a Mesh is telling it how much it should fade, by giving it an alpha value AND a color for the model. This is done in the Mesh's VertexBuffer class, remember that? Its setDefaultColor method is used for flat-coloring (giving the entire model the same color). M3G also supports per-vertex colors and smooth color shading, which means you can give different parts of the model different colors, but we won't be doing that today. Instead, we just want to supply a single color with an alpha value for the entire model and blend it with its surroundings. To do this we just supply a color in 0xAARRGGBB format to the setDefaultColor method.
mesh.getVertexBuffer().setDefaultColor(0x80FF0000);
The above line of code will take the Mesh mesh and make it semi-transparent (an alpha value of 0x80, or 128, which is half transparent) with the color bright red (0xFF, or 255, red). If the Mesh didn't have a texture, it would now be a bright red, semi-transparent model. However, we also want to blend our texture into the model. Why? Well, if we blend our texture as well, we can actually change the color of the texture to anything we want, without using multiple textures. For a particle engine this is very useful, since we use a single texture and just blend it with the Mesh's default color to get particles in all different colors. Nifty, isn't it? So, to blend a texture we have to go back to the Appearance class and one of its attributes, the CompositingMode.

The CompositingMode tells the Appearance class exactly how to composite (or blend) the model with its surroundings. There are a number of ways to blend a model with its surroundings, and each produces slightly different results. We will be using the most common and simplest method, ALPHA blending. This is how you would set the CompositingMode to ALPHA blending and add it to a Mesh's Appearance:
CompositingMode cm = new CompositingMode();
cm.setBlending(CompositingMode.ALPHA);
m.getAppearance(0).setCompositingMode(cm);
See how easy that was? If you want to know about the other blending modes (ALPHA_ADD, MODULATE, MODULATE_X2 and REPLACE) check the M3G API documentation of the CompositingMode class.
We're almost done setting up our blending. All that's left is to modify the texture. Remember when we created the texture and set it to blend with FUNC_REPLACE, which completely ignored the underlying Mesh's color and transparency? Well, we can't use that to blend our texture to our Mesh, so it has to be changed. There are four ways to blend a texture with the underlying model: FUNC_REPLACE (which we've already seen), FUNC_BLEND (which actually blends), and FUNC_DECAL and FUNC_MODULATE, which are special modes you'll have to read about in the API documentation or wait for a later part of this tutorial series. Today we'll be using FUNC_BLEND. It will simply blend our texture (and the texture's alpha values) with the model's color and alpha value. To do this, we need to retrieve a mesh's texture (or textures, if we have many) and set the correct blending. To fetch the first texture of a model (which we will do today) we do this:
m.getAppearance(0).getTexture(0).setBlending(textureBlending);
See, it's that easy! Now we've set up blending for our model and can display it in a variety of colors and transparency. Before we do anything though, let's talk about the kind of texture image we need for blending.
Blending Image
To blend a texture image with the underlying model, you can't use just any kind of texture. Well, actually you can, but you won't get the desired results. For instance, if the texture is fully white, it can't really be blended with the model's color, since we use an algorithm that adds the color of the model to the color of the texture. This is a problem for white textures, since white is the highest color value (255) and no matter how much you add to it, you won't be able to produce anything but white. For best blending and complete color control via the Mesh's default color, use a completely black texture. This way you can set whatever color you want as the Mesh's default color, and the texture will take on that same color.
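A rough plain-Java sketch of why a white texture saturates, assuming the additive, clamped per-channel combination described above (the exact math depends on the blend function you pick; check the Texture2D documentation):

```java
public class BlendDemo {
    /** Additively combines one 0-255 color channel of texture and mesh, clamped to 255. */
    public static int addClamped(int texChannel, int meshChannel) {
        int sum = texChannel + meshChannel;
        return sum > 255 ? 255 : sum;
    }

    public static void main(String[] args) {
        // A white texel (255) is already saturated: the mesh color can't influence it.
        System.out.println(addClamped(255, 0));   // 255
        System.out.println(addClamped(255, 128)); // still 255
        // A black texel (0) passes the mesh color straight through.
        System.out.println(addClamped(0, 128));   // 128
    }
}
```

This is exactly why the black particle texture below gives us full color control from the Mesh's default color alone.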

Here is a picture of the texture we'll use in the game. As you can see, it's completely black and fades out at the edges to create a nice and smooth effect. You don't have to use black if you don't want to; just remember that the color of the texture is always blended with the color of the Mesh, so you might get a completely different color out than you had in mind.

Controlling the Alpha
Now, to actually control the alpha (transparency) of the model and texture, you just manipulate the Mesh's default color (which is in the Mesh's VertexBuffer, remember?). The lower the Alpha value is in the default color, the more transparent the Mesh will be. To construct a color with a specific alpha value, you could do this:
int color = (alpha << 24) | (red << 16) | (green << 8) | blue;
The above piece of code will combine an alpha component with a red, green and blue color and create the type of color that the VertexBuffer expects in its setDefaultColor method. So by changing the alpha variable, you also change the transparency. You'll see a practical example of this later on.
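As a sanity check, here is a small self-contained round trip of that packing (a plain-Java illustration; the helper names are mine, not M3G's), using the half-transparent red 0x80FF0000 from earlier:

```java
public class ColorPack {
    /** Packs separate 0-255 channels into the 0xAARRGGBB format setDefaultColor expects. */
    public static int pack(int alpha, int red, int green, int blue) {
        return (alpha << 24) | (red << 16) | (green << 8) | blue;
    }

    /** Extracts the alpha channel back out of a packed color. */
    public static int alphaOf(int argb) {
        return argb >>> 24; // unsigned shift so the sign bit doesn't smear
    }

    public static void main(String[] args) {
        int c = pack(0x80, 0xFF, 0x00, 0x00);
        System.out.println(Integer.toHexString(c)); // 80ff0000
        System.out.println(alphaOf(c));             // 128, i.e. half transparent
    }
}
```

Note the unsigned shift (>>>) when unpacking: alpha values of 0x80 and above make the packed int negative in Java, and a signed shift would drag the sign bit along.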
Finishing the Particle System
Now that we can actually create Meshes from code, let's create a utility function that will do it for us. First let's make something that'll make any Mesh blendable. It could look something like this:
/** Sets the default ARGB color (including alpha) of a mesh. Only meaningful if the mesh is already alpha blended */
public static void setMeshAlpha(Mesh m, int alpha)
{
    m.getVertexBuffer().setDefaultColor(alpha);
}

/**
 * Converts a mesh to an alpha-blended one.
 * @param m The mesh to convert to a blended one
 * @param alpha The ARGB color (including alpha) to blend with
 * @param textureBlending The texture blending parameter (see Texture2D)
 */
public static void convertToBlended(Mesh m, int alpha, int textureBlending)
{
    // Set the alpha
    setMeshAlpha(m, alpha);

    // Fix the compositing mode
    CompositingMode cm = new CompositingMode();
    cm.setBlending(CompositingMode.ALPHA);
    m.getAppearance(0).setCompositingMode(cm);
    m.getAppearance(0).getTexture(0).setBlending(textureBlending);
}
Those two methods can convert any Mesh to be blendable with the rest of our scene. That's always nice to have. Now let's get a method that actually does the whole creating process:
/**
* Creates a textured plane.
* @param texFilename The name of the texture image file
* @param cullFlags The flags for culling. See PolygonMode.
* @return The finished textured mesh
*/
public static Mesh createPlane(String texFilename, int cullFlags)
{
    // The vertices of the plane
    short vertices[] = new short[] {-1, -1, 0,
                                     1, -1, 0,
                                     1,  1, 0,
                                    -1,  1, 0};

    // Texture coords of the plane
    short texCoords[] = new short[] {0, 255,
                                     255, 255,
                                     255, 0,
                                     0, 0};

    // The classes
    VertexArray vertexArray, texArray;
    IndexBuffer triangles;

    // Create the model's vertices
    vertexArray = new VertexArray(vertices.length / 3, 3, 2);
    vertexArray.set(0, vertices.length / 3, vertices);

    // Create the model's texture coords
    texArray = new VertexArray(texCoords.length / 2, 2, 2);
    texArray.set(0, texCoords.length / 2, texCoords);

    // Compose a VertexBuffer out of the vertices and texture coordinates
    VertexBuffer vertexBuffer = new VertexBuffer();
    vertexBuffer.setPositions(vertexArray, 1.0f, null);
    vertexBuffer.setTexCoords(0, texArray, 1.0f / 255.0f, null);

    // Create indices and face lengths
    int indices[] = new int[] {0, 1, 3, 2};
    int[] stripLengths = new int[] {4};

    // Create the model's triangles
    triangles = new TriangleStripArray(indices, stripLengths);

    // Create the appearance
    Appearance appearance = new Appearance();
    PolygonMode pm = new PolygonMode();
    pm.setCulling(cullFlags);
    appearance.setPolygonMode(pm);

    // Create and set the texture
    try
    {
        // Open image
        Image texImage = Image.createImage(texFilename);
        Texture2D theTexture = new Texture2D(new Image2D(Image2D.RGBA, texImage));

        // Replace the mesh's original colors (no blending)
        theTexture.setBlending(Texture2D.FUNC_REPLACE);

        // Set wrapping and filtering
        theTexture.setWrapping(Texture2D.WRAP_CLAMP, Texture2D.WRAP_CLAMP);
        theTexture.setFiltering(Texture2D.FILTER_BASE_LEVEL, Texture2D.FILTER_NEAREST);

        // Add texture to the appearance
        appearance.setTexture(0, theTexture);
    }
    catch(Exception e)
    {
        // Something went wrong
        System.out.println("Failed to create texture");
        System.out.println(e);
    }

    // Finally create the Mesh
    Mesh mesh = new Mesh(vertexBuffer, triangles, appearance);

    // All done
    return mesh;
}
That's a lot of code, but it's not really that hard. Check it out and then backtrack to the part of the tutorial where I explained how to create a Mesh from code, and you'll realize it's simple.
Now that we can create Meshes, we need to create a ParticleEffect class to use with our ParticleSystem and get some kind of effect. I am aiming for a fountain effect that you can rotate around the screen as much as you want. A good exercise after this tutorial would be to create your own ParticleEffect class that does some kind of cool effect. Try creating a circular explosion (like fireworks), or maybe even something as advanced as a flickering flame? Anyhow, first we'll use some nice object-oriented programming to abstract behavior that many ParticleEffects will share: using a flat Mesh as a particle. This is how that class looks:
/**
 * Represents a particle effect that uses a bitmap.
 */
public abstract class BitmapParticleEffect implements ParticleEffect
{
    // The mesh
    Mesh mesh = null;

    // The transformation matrix
    Transform trans = new Transform();

    // The scale
    float scale = 1.0f;

    /** Initializes the bitmap used to render particles */
    public BitmapParticleEffect(String filename, float scale)
    {
        // Load the plane with the wanted texture
        mesh = MeshFactory.createAlphaPlane(filename, PolygonMode.CULL_BACK, 0xffffffff);

        // Make sure we set the scale
        this.scale = scale;
    }

    /**
     * @see ParticleEffect#render(Particle, Graphics3D)
     */
    public void render(Particle p, Graphics3D g3d)
    {
        // Calculate the alpha
        int alpha = (int)(255 * p.getLife());

        // Create the color
        int color = p.getColor() | (alpha << 24);

        // Set alpha
        MeshOperator.setMeshAlpha(mesh, color);

        // Transform
        trans.setIdentity();
        trans.postScale(scale, scale, scale);
        float[] pos = p.getPos();
        trans.postTranslate(pos[0], pos[1], pos[2]);

        // Render
        g3d.render(mesh, trans);
    }
}
As you can see, it is an abstract class that already handles a lot for us, such as the immediate mode rendering. Check out the render method and see if you recognize what I talked about earlier in this tutorial. As you can see, we also set the alpha of the Mesh according to the particle's life. Since each particle already has a color in the format 0xRRGGBB, all we do is insert the alpha value and set the result as the Mesh's default color. The MeshFactory.createAlphaPlane method used in the constructor is very simple: it first calls the createPlane method I showed earlier, and then uses our utility functions to make the Mesh blendable.
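The life-to-color step from the render method can be isolated and checked on its own. A minimal plain-Java sketch (the helper name is mine; the arithmetic matches the two lines in render):

```java
public class ParticleColor {
    /**
     * Combines a particle's 0xRRGGBB color with an alpha derived from its life
     * (1.0f = newborn and fully opaque, 0.0f = dead and fully transparent).
     */
    public static int withLifeAlpha(int rgb, float life) {
        int alpha = (int)(255 * life);
        return rgb | (alpha << 24);
    }

    public static void main(String[] args) {
        // A newborn green particle: alpha 0xFF packed on top of 0x00FF00
        System.out.println(Integer.toHexString(withLifeAlpha(0x00FF00, 1.0f))); // ff00ff00
        // A dead one: alpha 0x00, so the particle has faded out completely
        System.out.println(Integer.toHexString(withLifeAlpha(0x00FF00, 0.0f))); // ff00
    }
}
```

Feeding this value to setMeshAlpha each frame is what makes the particles fade smoothly as their life ticks down.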
The above class does a lot of the work for us so all we need to do now is actually make the particles move around. I chose to create a fountain effect and here is the result:
/**
 * Creates a nice fountain effect that shoots particles in a certain
 * direction, determined by its angle. The angle can be changed in real time.
 */
public class FountainEffect extends BitmapParticleEffect
{
    // The angle of particle emission
    private int angle = 90;

    // The sine and cosine of the current angle
    private float[] trig = {1.0f, 0.0f};

    // The emitting origin
    private float[] pos = {0.0f, 0.0f, 0.0f};

    // The randomizer
    Random rand = null;

    /**
     * @param angle The angle of particle emission
     */
    public FountainEffect(int angle)
    {
        // Init the bitmap
        super("/res/particle.png", 0.05f);

        // Set the angle
        setAngle(angle);

        // Get randomizer
        rand = new Random();
    }

    /**
     * @see ParticleEffect#init(Particle)
     */
    public void init(Particle p)
    {
        // Set the particle's life
        p.setLife(1.0f);

        // Set the particle's position
        p.setPos(pos);

        // Create the particle's velocities
        float[] vel = new float[3];

        // We want velocities from 0.2f to 1.0f
        float xyvel = rand.nextFloat() * 0.8f + 0.2f;

        // We want the particle to die slowly
        p.setDegradation(xyvel / 18);

        // Set velocities according to trigonometry, with a small deviation
        vel[0] = xyvel * trig[1] + rand.nextFloat() * 0.125f - 0.0625f;
        vel[1] = xyvel * trig[0] + rand.nextFloat() * 0.125f - 0.0625f;

        // No movement in depth
        vel[2] = 0.0f;

        // Set the velocity
        p.setVel(vel);

        // Set the random color
        int r = (int)(120 * rand.nextFloat()) + 135;
        int g = (int)(120 * rand.nextFloat()) + 135;
        int b = (int)(120 * rand.nextFloat()) + 135;
        int col = (r << 16) | (g << 8) | b;
        p.setColor(col);
    }

    /**
     * @see ParticleEffect#update(Particle)
     */
    public void update(Particle p)
    {
        // Simply update the position
        float[] ppos = p.getPos();
        float[] vel = p.getVel();
        ppos[0] += vel[0];
        ppos[1] += vel[1];
        ppos[2] += vel[2];

        // Update life
        p.setLife(p.getLife() - p.getDegradation());

        // Check life. If the particle is dead, we just reinit it
        if(p.getLife() < -0.001f)
        {
            init(p);
        }
    }

    /**
     * @param angle The angle to set.
     */
    public void setAngle(int angle) {
        this.angle = angle;
        trig[0] = (float)Math.sin(Math.toRadians(angle));
        trig[1] = (float)Math.cos(Math.toRadians(angle));
    }

    /**
     * @return Returns the angle.
     */
    public int getAngle() {
        return angle;
    }

    /**
     * @param pos The pos to set.
     */
    void setEmittingOrigin(float[] pos) {
        this.pos = pos;
    }

    /**
     * @return Returns the pos.
     */
    float[] getEmittingOrigin() {
        return pos;
    }
}
I won't go into detail here, since it is really basic trigonometry. You can throw that class out and replace it with a class of your own that does some cool particle effect. All it does is randomize a particle's velocity depending on the angle of the fountain, and then move the particle along until it dies. You can change the angle in real time to rotate the fountain. I've added this to the main game loop and it looks something like this:
// Check controls for fountain rotation
if(key[LEFT])
    fx.setAngle(fx.getAngle() + 5);
if(key[RIGHT])
    fx.setAngle(fx.getAngle() - 5);
The fx object is the FountainEffect reference whose angle we change. As you see, pressing left increases the angle (rotating the fountain counter-clockwise) and pressing right decreases it (rotating clockwise).
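The "basic trigonometry" inside init can be sketched in isolation: trig[1] is the cosine (the x component) and trig[0] the sine (the y component), so before the random jitter a particle's base velocity is just a vector of length xyvel pointing along the emission angle. A plain-Java illustration (helper name is mine, jitter omitted):

```java
public class FountainMath {
    /** Base velocity (before random jitter) for a given emission angle and speed. */
    public static float[] baseVelocity(int angleDegrees, float speed) {
        float sin = (float)Math.sin(Math.toRadians(angleDegrees));
        float cos = (float)Math.cos(Math.toRadians(angleDegrees));
        // x uses the cosine, y uses the sine, and there is no movement in depth
        return new float[] { speed * cos, speed * sin, 0.0f };
    }

    public static void main(String[] args) {
        float[] up = baseVelocity(90, 1.0f);
        // At 90 degrees the fountain points straight up: x ~ 0, y ~ 1
        System.out.println(up[0] + " " + up[1]);
    }
}
```

This is why the default 90-degree fountain shoots straight upward, and why nudging the angle with the joystick tilts the whole spray.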
Setting up the Scene for Immediate Mode rendering
Now that we've done everything, I'll just show you how I've set up the immediate mode rendering. You should already know this since I went through it earlier in this part of the tutorial, so I'll just repeat the info. Here is how we construct the canvas:
/** Constructs the canvas */
public M3GCanvas(int fps)
{
// We don't want to capture keys normally
super(true);

// We want a fullscreen canvas
setFullScreenMode(true);

// Load our camera
loadCamera();

// Load our background
loadBackground();

// Set up graphics 3d
setUp();
}
The loadCamera method creates the camera that our 3D system will use and is very simple: all it does is create a default camera. The loadBackground method creates a background and sets its color to black (0x0). Also very simple. The last method, setUp, finishes the initialization by adding an ambient light to our rendering context. It looks like this:
/** Prepares the Graphics3D engine for immediate mode rendering by adding a light */
private void setUp()
{
// Get the instance
g3d = Graphics3D.getInstance();

// Add a light to our scene, so we can see something
g3d.addLight(createAmbientLight(), identity);
}


/** Creates a simple ambient light */
private Light createAmbientLight()
{
Light l = new Light();
l.setMode(Light.AMBIENT);
l.setIntensity(1.0f);
return l;
}
Nothing new there really. We've created ambient lights many times before and you should know this by heart.
So, now our scene is ready for rendering and all we need to do is emit some particles onscreen. As I already mentioned, this is done by calling the emit method on the ParticleSystem object. We create a ParticleSystem, add a ParticleEffect to it (our FountainEffect) and then we just call emit each iteration of the game loop. This is what I placed into the main game loop:
// Envelop everything in a try/catch block, just in case
try
{
    // Get the Graphics3D context
    g3d = Graphics3D.getInstance();

    // First bind the graphics object. We use our pre-defined rendering hints.
    g3d.bindTarget(g, true, RENDERING_HINTS);

    // Clear the background
    g3d.clear(back);

    // Bind the camera at a fixed position at the origin
    g3d.setCamera(cam, identity);

    // Init particles
    if(ps == null)
    {
        fx = new FountainEffect(90);
        ps = new ParticleSystem(fx, 20);
    }

    // Emit the particles
    ps.emit(g3d);

    // Check controls for fountain rotation
    if(key[LEFT])
        fx.setAngle(fx.getAngle() + 5);
    if(key[RIGHT])
        fx.setAngle(fx.getAngle() - 5);

    // Quit if the user presses fire
    if(key[FIRE])
        TutorialMidlet.die();
}
catch(Exception e)
{
    reportException(e);
}
finally
{
    // Always remember to release the target!
    g3d.releaseTarget();
}
Let's dissect that code to understand it. It's pretty basic: we get the instance of the Graphics3D object and bind it to our Graphics object (the above code lives in the draw(Graphics g) method, so we already have the Graphics object). After this, we clear the background and depth buffer by calling g3d.clear with our Background object. This gives us a clean, black canvas. Next we set our camera into the world with its own transform (it's wasteful to do this every iteration of the game loop, but I'm doing it here for clarity). We'll always use the identity transform, since we don't want to move the camera in this tutorial. Then we check whether our particle system is still null, and initialize it if necessary: we create a nice fountain effect that starts at an angle of 90 degrees (pointing straight up) and a particle system that holds 20 particles. After this is done we call the emit method, and the particles get updated and rendered into our world. Finally we check for some keys: the joystick keys for rotating our nice particle system, and the Fire key for exiting the application.
There! It wasn't that hard now was it? As I said, to practice you can actually replace the FountainEffect class with something you wrote yourself, to get a personal particle effect. You can also try loading a lot of different Meshes into memory with a lot of different textures, since now you have the methods that do the work for you.
Conclusion
So to wrap it all up, here are a few screenshots of the code in action. Pretty, pretty colors!


There's our particle system! Note how pretty it is when the particles fade away as they get older. I also chose to randomize the particle colors in the upper half of the range (roughly 135 to 255), so that we don't get any dark and moody colors. We just want warm and bright happiness! You can download the full source package, including the resources, at the bottom of the document.

------------------------------------------------------
Source code: see the attachment.