[JSR-184][3D编程指南]Part III: Particle systems and immediate mode rendering (2)
- Blog category: J2me
<!-- Collected from around the web; archived here for future reference -->
<!-- Continued from Part III (1) -->
The M3G system stores face (triangle) information in a class called IndexBuffer. We talked about this class in the second part of the tutorial, so you should know what it does, but I will refresh your memory anyway. The IndexBuffer holds indices into the vertex array and thereby describes the faces (triangles) that make up our model. Since our model is a simple plane (a quad), it consists of two triangles. IndexBuffer itself is an abstract class, so we need a class that extends it: the TriangleStripArray. As you can tell by its name, it stores triangle strips in an array, nifty eh?
Now, there are two ways to describe triangles in a TriangleStripArray: explicit and implicit. The explicit description is the more common one, since you get to define exactly which vertex indices each triangle consists of. However, the implicit form can also be very useful. The difference is that in the explicit form you supply an array that holds all the triangle indices, while in the implicit form you supply only the index of the first vertex and the TriangleStripArray calculates the rest, assuming that each subsequent index is one greater than the previous.
In this example we will use the explicit form for simplicity's sake. The TriangleStripArray expects its triangles to be defined as a strip: you define three indices for the first triangle, and then only one more index for each triangle that follows. This is what's generally called a triangle strip. How does this work, you might wonder? As you know, stacking triangles next to each other in a model (such as our simple plane) means every triangle shares an edge, i.e. two points, with its neighbour (remember our cube from tutorial 2?). So we can define a new triangle by reusing two points from the previous triangle plus one extra point.
You probably remember from our plane coordinates that we created our plane counter-clockwise, starting from the point (-1, -1, 0). Cutting the quad in two along the diagonal from point 3 to point 1 leaves us with two triangles: (0, 1, 3) and (1, 3, 2). Can you see the pattern? The first triangle shares points 1 and 3 with the second one, so we can describe both as the strip (0, 1, 3, 2), and the TriangleStripArray will understand that we have two triangles sharing the middle indices. Let's see how we do this in code.
int indices[] = new int[] {0, 1, 3, 2};
int[] stripLengths = new int[] {4};
// Create the model's triangles
triangles = new TriangleStripArray(indices, stripLengths);
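For comparison, the implicit form takes only the index of the first vertex plus the strip lengths and generates consecutive indices by itself. With our perimeter-ordered vertices that would give the strip (0, 1, 2, 3), which is not the triangulation we want here, so the implicit form really only pays off when the vertex data is already laid out in strip order. A small illustration of the constructor, not part of the tutorial's code:
// Implicit form: vertex indices are generated automatically, starting at firstIndex
int firstIndex = 0;
IndexBuffer implicitTriangles = new TriangleStripArray(firstIndex, new int[] {4});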
The Appearance class holds a lot of information on how the model should be rendered: alpha blending, textures, culling, you name it. We will need one of these to create our model, but today we'll only use a few of its features. Let's start with culling. Culling is a technique used to speed up rendering by telling the system not to render certain parts of a model, such as its back or front faces. For a particle engine, for instance, we never want to see the back of a particle, so we can always use back-face culling on our particles. The available culling modes are:
PolygonMode.CULL_BACK
PolygonMode.CULL_FRONT
PolygonMode.CULL_NONE
Appearance appearance = new Appearance();
PolygonMode pm = new PolygonMode();
pm.setCulling(cullFlags);
appearance.setPolygonMode(pm);
In order to create a texture that a model can use, you need two things: an Image2D containing the picture data and a Texture2D object. The Texture2D class holds all relevant texture information, including the image data and the texture transformation that moves our cookie cutter around, remember? It also holds other useful things such as blending (alpha blending), wrapping (how the texture wraps across the polygon) and filtering (the quality of the texture projection). We will be using all of the above today, but blending will be explained a bit later. Let's take it step by step. First we create an Image2D and the corresponding Texture2D object:
Image texImage = Image.createImage(texFilename);
Texture2D theTexture = new Texture2D(new Image2D(Image2D.RGBA, texImage));
theTexture.setBlending(Texture2D.FUNC_REPLACE);
theTexture.setWrapping(Texture2D.WRAP_CLAMP, Texture2D.WRAP_CLAMP);
theTexture.setFiltering(Texture2D.FILTER_BASE_LEVEL, Texture2D.FILTER_NEAREST);
appearance.setTexture(0, theTexture);
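The tutorial clamps the texture and uses the cheapest (nearest-neighbour) filtering. Texture2D also offers tiled wrapping and smoother filtering if you need them; purely as an illustration, not used in this code:
// Tile the texture across the polygon and filter it more smoothly (but more slowly)
theTexture.setWrapping(Texture2D.WRAP_REPEAT, Texture2D.WRAP_REPEAT);
theTexture.setFiltering(Texture2D.FILTER_BASE_LEVEL, Texture2D.FILTER_LINEAR);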
Creating a Mesh is very simple. All we need to do is supply the parts we just created and compose it. This is done with a single line of code, like this:
Mesh mesh = new Mesh(vertexBuffer, triangles, appearance);
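A Mesh can also be built from several submeshes, each with its own Appearance, through an array-based constructor; shown only for reference, since our quad needs just one of each:
Mesh multiMesh = new Mesh(vertexBuffer, new IndexBuffer[] {triangles}, new Appearance[] {appearance});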
Before we start coding away on a ParticleEffect class that will do something cool, let's first talk a bit about alpha blending of textures and polygons.
Often you want to blend a model and its texture into its surroundings, creating semi-transparent models. This can be used for many different things: windows, water surfaces, particle engines and so on. The list is practically endless and only your imagination sets the limits. In M3G this process is very simple, and today I will show you how to do it in our particle engine. We want our particles to fade away and become more and more transparent as they progress through their life, disappearing completely when they die, so we will need alpha blending. This is done in two steps.
The CompositingMode tells the Appearance class exactly how to composite (blend) the model with its surroundings. There are a number of ways to blend a model with its surroundings and each produces slightly different results. We will be using the most common and simplest method, ALPHA blending. This is how you set the CompositingMode to ALPHA blending and add it to a Mesh's Appearance:
CompositingMode cm = new CompositingMode();
cm.setBlending(CompositingMode.ALPHA);
m.getAppearance(0).setCompositingMode(cm);
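CompositingMode has other blending functions as well; additive blending, for instance, is a popular choice for glowing particles (an aside, not what this tutorial uses):
// Additive alpha blending gives bright, glow-like particles
cm.setBlending(CompositingMode.ALPHA_ADD);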
To blend a texture image with the underlying model, you can't use just any texture. Well, actually you can, but you won't get the desired results. For instance, if the texture is fully white, it can't really be blended with the model's color, since the blending algorithm we use adds the color of the model to the color of the texture. This is a problem for white textures because white is already the maximum value (255) in every channel, and no matter how much you add to it you can't produce anything but white. For the best blending and complete color control through the Mesh's default color, use a completely black texture. That way, whatever color you set as the Mesh's default color is also the color the textured surface takes on.
The texture we'll use in the game is completely black and blurs out towards the edges to create a nice, smooth effect. You don't have to use black if you don't want to, but remember that the color of the texture is always blended with the color of the Mesh, so you might end up with a completely different color than you had in mind.
Now, to actually control the alpha (transparency) of the model and its texture, you just manipulate the Mesh's default color (which lives in the Mesh's VertexBuffer, remember?). The lower the alpha value of the default color, the more transparent the Mesh will be. To construct a color with a specific alpha value, you could do this:
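A minimal sketch, assuming a plain ARGB int like the one the render() method builds further down:
int alpha = 128; // 0 = fully transparent, 255 = fully opaque
int rgb = 0x00ffffff; // pure white
int argb = (alpha << 24) | rgb; // 0x80ffffff
mesh.getVertexBuffer().setDefaultColor(argb);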
Now that we can actually create Meshes from code, let's create a utility function that will do it for us. First let's make something that'll make any Mesh blendable. It could look something like this:
/** Sets the mesh's default color (a full ARGB value); the top byte is the alpha used for blending. */
public static void setMeshAlpha(Mesh m, int alpha)
{
m.getVertexBuffer().setDefaultColor(alpha);
}
/**
* Converts a mesh so that it is alpha-blended with its surroundings.
* @param m The mesh to convert to a blended one
* @param alpha The alpha color to blend with (a full ARGB value)
* @param textureBlending The texture blending parameter (one of the Texture2D.FUNC_* constants).
*/
public static void convertToBlended(Mesh m, int alpha, int textureBlending)
{
// Set the alpha
setMeshAlpha(m, alpha);
// Fix the compositing mode
CompositingMode cm = new CompositingMode();
cm.setBlending(CompositingMode.ALPHA);
m.getAppearance(0).setCompositingMode(cm);
m.getAppearance(0).getTexture(0).setBlending(textureBlending);
}
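As a usage example: to make an already-created textured mesh half-transparent and have its (black) texture added on top of the mesh's default color, a call could look like this. The 0x80 alpha and the FUNC_ADD texture function are illustrative assumptions, not values taken from the tutorial:
// Half-transparent mesh whose texture is added to the mesh's default color
convertToBlended(mesh, 0x80ffffff, Texture2D.FUNC_ADD);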
/**
 * Creates a textured plane.
 * @param texFilename The name of the texture image file
 * @param cullFlags The flags for culling. See PolygonMode.
 * @return The finished textured mesh
 */
public static Mesh createPlane(String texFilename, int cullFlags)
{
// The vertices of the plane
short vertices[] = new short[] {-1, -1, 0,
1, -1, 0,
1, 1, 0,
-1, 1, 0};
// Texture coords of the plane
short texCoords[] = new short[] {0, 255,
255, 255,
255, 0,
0, 0};
// The classes
VertexArray vertexArray, texArray;
IndexBuffer triangles;
// Create the model's vertices
vertexArray = new VertexArray(vertices.length/3, 3, 2);
vertexArray.set(0, vertices.length/3, vertices);
// Create the model's texture coords
texArray = new VertexArray(texCoords.length / 2, 2, 2);
texArray.set(0, texCoords.length / 2, texCoords);
// Compose a VertexBuffer out of the previous vertices and texture coordinates
VertexBuffer vertexBuffer = new VertexBuffer();
vertexBuffer.setPositions(vertexArray, 1.0f, null);
vertexBuffer.setTexCoords(0, texArray, 1.0f/255.0f, null);
// Create indices and face lengths
int indices[] = new int[] {0, 1, 3, 2};
int[] stripLengths = new int[] {4};
// Create the model's triangles
triangles = new TriangleStripArray(indices, stripLengths);
// Create the appearance
Appearance appearance = new Appearance();
PolygonMode pm = new PolygonMode();
pm.setCulling(cullFlags);
appearance.setPolygonMode(pm);
// Create and set the texture
try
{
// Open image
Image texImage = Image.createImage(texFilename);
Texture2D theTexture = new Texture2D(new Image2D(Image2D.RGBA, texImage));
// Replace the mesh's original colors (no blending)
theTexture.setBlending(Texture2D.FUNC_REPLACE);
// Set wrapping and filtering
theTexture.setWrapping(Texture2D.WRAP_CLAMP, Texture2D.WRAP_CLAMP);
theTexture.setFiltering(Texture2D.FILTER_BASE_LEVEL, Texture2D.FILTER_NEAREST);
// Add texture to the appearance
appearance.setTexture(0, theTexture);
}
catch(Exception e)
{
// Something went wrong
System.out.println("Failed to create texture");
System.out.println(e);
}
// Finally create the Mesh
Mesh mesh = new Mesh(vertexBuffer, triangles, appearance);
// All done
return mesh;
}
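The BitmapParticleEffect below calls MeshFactory.createAlphaPlane, which is not listed in this part of the tutorial. A plausible minimal sketch, assuming it simply chains createPlane and convertToBlended (the body, the FUNC_ADD choice and the assumption that both helpers are visible from the same class are mine, not the original code):
/** Hypothetical helper: creates a textured plane that is alpha-blended with its surroundings. */
public static Mesh createAlphaPlane(String texFilename, int cullFlags, int color)
{
Mesh m = createPlane(texFilename, cullFlags);
// Assumed behaviour: blend with the supplied ARGB color and add the texture on top
convertToBlended(m, color, Texture2D.FUNC_ADD);
return m;
}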
/**
 * Represents a particle effect that uses a bitmap.
*/
public abstract class BitmapParticleEffect implements ParticleEffect
{
// The mesh
Mesh mesh = null;
// The transformation matrix
Transform trans = new Transform();
// The scale
float scale = 1.0f;
/** Initializes the bitmap used to render particles */
public BitmapParticleEffect(String filename, float scale)
{
// Load the plane with the wanted texture
mesh = MeshFactory.createAlphaPlane(filename, PolygonMode.CULL_BACK, 0xffffffff);
// Make sure we set the scale
this.scale = scale;
}
/**
* @see ParticleEffect#render(Particle, Graphics3D)
*/
public void render(Particle p, Graphics3D g3d)
{
// Calculate the alpha
int alpha = (int)(255 * p.getLife());
// Create the color
int color = p.getColor() | (alpha << 24);
// Set alpha
MeshOperator.setMeshAlpha(mesh, color);
// Transform
trans.setIdentity();
trans.postScale(scale, scale, scale);
float[] pos = p.getPos();
trans.postTranslate(pos[0], pos[1], pos[2]);
// Render
g3d.render(mesh, trans);
}
}
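Both classes rely on the ParticleEffect interface and the Particle data holder defined in part (1) of this chapter, which are not repeated in this post. A minimal sketch reconstructed purely from the calls made here (the method names follow the @see tags; the original classes may differ in detail):
/** Callback interface implemented by concrete particle effects (sketch). */
public interface ParticleEffect
{
void init(Particle p);
void update(Particle p);
void render(Particle p, Graphics3D g3d);
}
/** Simple data holder describing one particle (sketch). */
public class Particle
{
private float life, degradation;
private float[] pos = new float[3];
private float[] vel = new float[3];
private int color;
public float getLife() { return life; }
public void setLife(float life) { this.life = life; }
public float getDegradation() { return degradation; }
public void setDegradation(float d) { degradation = d; }
// getPos()/getVel() return the internal arrays so update() can modify them in place;
// setPos() copies, so the emitter's origin array is never changed by a particle
public float[] getPos() { return pos; }
public void setPos(float[] p) { System.arraycopy(p, 0, pos, 0, 3); }
public float[] getVel() { return vel; }
public void setVel(float[] v) { vel = v; }
public int getColor() { return color; }
public void setColor(int c) { color = c; }
}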
/**
 * Creates a nice fountain effect that shoots particles in a certain direction,
 * determined by its angle. The angle can be changed in real time.
 */
public class FountainEffect extends BitmapParticleEffect
{
// The angle of particle emission
private int angle = 90;
// The sine and cosine of the current angle
private float[] trig = {1.0f, 0.0f};
// The emitting origin
private float[] pos = {0.0f, 0.0f, 0.0f};
// The randomizer
Random rand = null;
/**
* @param angle The angle of particle emission
*/
public FountainEffect(int angle)
{
// Init the bitmap
super("/res/particle.png", 0.05f);
// Set the angle
setAngle(angle);
// Get randomizer
rand = new Random();
}
/**
* @see ParticleEffect#init(Particle)
*/
public void init(Particle p)
{
// Set the particle's life
p.setLife(1.0f);
// Set the particle's position
p.setPos(pos);
// Create the particle's velocities
float[] vel = new float[3];
// We want velocities from 0.2f to 1.0f
float xyvel = rand.nextFloat() * 0.8f + 0.2f;
// We want the particle to die slowly
p.setDegradation(xyvel / 18);
// Set velocities according to trigonometry with a small deviation
vel[0] = xyvel * trig[1] + rand.nextFloat() * 0.125f - 0.0625f;
vel[1] = xyvel * trig[0] + rand.nextFloat() * 0.125f - 0.0625f;
// No movement in depth
vel[2] = 0.0f;
// Set the velocity
p.setVel(vel);
// Set the random color
int r = (int)(120 * rand.nextFloat()) + 135;
int g = (int)(120 * rand.nextFloat()) + 135;
int b = (int)(120 * rand.nextFloat()) + 135;
int col = (r << 16) | (g << 8) | b;
p.setColor(col);
}
/**
* @see ParticleEffect#update(Particle)
*/
public void update(Particle p)
{
// Simply update position
float[] ppos = p.getPos();
float[] vel = p.getVel();
ppos[0] += vel[0];
ppos[1] += vel[1];
ppos[2] += vel[2];
// Update life
p.setLife(p.getLife() - p.getDegradation());
// Check life. If it is dead, we just reinit it
if(p.getLife() < -0.001f)
{
init(p);
}
}
/**
* @param angle The angle to set.
*/
public void setAngle(int angle) {
this.angle = angle;
trig[0] = (float)Math.sin(Math.toRadians(angle));
trig[1] = (float)Math.cos(Math.toRadians(angle));
}
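For example, with the default angle of 90 degrees, trig[0] = sin 90° = 1 and trig[1] = cos 90° = 0, so init() produces a velocity of roughly (0, xyvel, 0) plus a small random deviation: the fountain shoots straight up. At 45 degrees the x and y components become equal and the stream leans diagonally.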
/**
* @return Returns the angle.
*/
public int getAngle() {
return angle;
}
/**
* @param pos The pos to set.
*/
void setEmittingOrigin(float[] pos) {
this.pos = pos;
}
/**
* @return Returns the pos.
*/
float[] getEmittingOrigin() {
return pos;
}
}
Because setAngle can be called at any time, the fountain can be rotated at run time straight from the key handling:
if(key[LEFT])
fx.setAngle(fx.getAngle() + 5);
if(key[RIGHT])
fx.setAngle(fx.getAngle() - 5);
Now that we've done everything, I'll just show you how I've set up the immediate mode rendering. You should already know this since I went through it earlier in this part of the tutorial, so I'll just repeat the info. Here is how we construct the canvas:
/** Constructs the canvas. */
public M3GCanvas(int fps)
{
// We don't want to capture keys normally
super(true);
// We want a fullscreen canvas
setFullScreenMode(true);
// Load our camera
loadCamera();
// Load our background
loadBackground();
// Set up graphics 3d
setUp();
}
private void setUp()
{
// Get the instance
g3d = Graphics3D.getInstance();
// Add a light to our scene, so we can see something
g3d.addLight(createAmbientLight(), identity);
}
/** Creates a simple ambient light */
private Light createAmbientLight()
{
Light l = new Light();
l.setMode(Light.AMBIENT);
l.setIntensity(1.0f);
return l;
}
Every frame, inside the canvas's draw loop, the actual immediate mode rendering then looks like this:
try
{
// Get the Graphics3D context
g3d = Graphics3D.getInstance();
// First bind the graphics object. We use our pre-defined rendering hints.
g3d.bindTarget(g, true, RENDERING_HINTS);
// Clear background
g3d.clear(back);
// Bind camera at a fixed position at the origin
g3d.setCamera(cam, identity);
// Init particles
if(ps == null)
{
fx = new FountainEffect(90);
ps = new ParticleSystem(fx, 20);
}
// Emit the particles
ps.emit(g3d);
// Check controls for fountain rotation
if(key[LEFT])
fx.setAngle(fx.getAngle() + 5);
if(key[RIGHT])
fx.setAngle(fx.getAngle() - 5);
// Quit if user presses fire
if(key[FIRE])
TutorialMidlet.die();
}
catch(Exception e)
{
reportException(e);
}
finally
{
// Always remember to release!
g3d.releaseTarget();
}
So to wrap it all up, here are a few screenshots of the code in action. Pretty, pretty colors!
- Redikod_3D_tutorial_part_3_source_code.zip (10.1 KB)