[JSR-184][3D编程指南]Part III: Particle systems and immediate mode rendering (1)
Introduction
Welcome to the third installment of the M3G tutorial! Today I'll go through how to gain total control over the rendering process (immediate mode rendering) and how to create a very nice particle system. Again, here are some links in case you get lost: first of all, and probably most importantly, the dedicated Mobile Java 3D web section on Sony Ericsson Developer World. Second, if you ever get stuck, go to the Sony Ericsson Mobile Java 3D forum. For everything else, use the Sony Ericsson Developer World web portal; there you will find the answers to your questions and more. The goal of this tutorial is to show you how to render the same object several times, with different transformations. This is called immediate mode, and you'll learn how powerful such a mode is for a great many things. This tutorial will also be the base of the more advanced tutorials to come, since we'll be almost exclusively using immediate mode for rendering from now on. Since the code is meant for educational purposes, it isn't optimal, nor does it cover all the errors that might occur. These are more advanced topics that will be addressed later on.
What you should know
Before you start reading this, you should have read the first two tutorials to have a somewhat firm grasp of basic M3G functionality.
- Part one: Quick jump into the world of Mobile Java 3D programming>>
- Part two: Light 3D theory and orientation>>
Retained Mode versus Immediate Mode Rendering
Retained mode is the mode used when you render an entire World, with
all the information that a World holds, including cameras and lights.
This is a pretty restricted mode, since we almost always want to draw a
single model multiple times, with different transformations, without
invoking an entire scene graph. Rendering a single group, node
or submesh in M3G is called immediate mode rendering. These are the
immediate mode methods of the Graphics3D class:
render ( Node node, Transform transform)
render ( VertexBuffer vertices, IndexBuffer triangles, Appearance appearance, Transform transform)
render ( VertexBuffer vertices, IndexBuffer triangles, Appearance appearance, Transform transform, int scope)
As you can see, all three methods require some kind of vertex data: either a Node, or a VertexBuffer/IndexBuffer pair. A Node can basically be any part of a scene graph; even a World is considered a Node. Usually you'll pass a Mesh or a Group to the first render method. The VertexBuffer, which we talked about in the second tutorial, is a collection of mesh data that describes a model in 3D space. The last two methods also require an Appearance class to understand how to display the mesh data. In this tutorial we'll only use the first method, to show you how immediate mode works.
Another thing all the methods have in common is that they need a Transform class that describes the model's transformation from local to world space. Remember, we talked about this in the last tutorial. Also keep in mind that most objects you want to render are Transformables, meaning they have their own internal transformation matrix. However, immediate mode rendering ignores all such transformation information and uses only the Transform matrix supplied to the method. This is actually very handy, as you can hold one Mesh of a spaceship in memory, but render it many times with different Transform matrices to display many different spaceships. You'll see this in action as we start to design the particle engine.
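As a minimal sketch of this idea (assuming a `Mesh shipMesh` and a bound `Graphics3D g3d`; both names are my own, and this fragment only runs inside a MIDlet on a device or emulator), the same mesh can be drawn at several positions by mutating one Transform between render calls:

```java
// One Mesh in memory, many spaceships on screen.
Transform t = new Transform();
for (int i = 0; i < 5; i++)
{
    t.setIdentity();
    // Place each ship 3 units apart along the x-axis, 10 units into the screen
    t.postTranslate(i * 3.0f, 0.0f, -10.0f);
    g3d.render(shipMesh, t);
}
```

Note that any rotation or translation set on shipMesh itself is ignored here; only `t` decides where each copy ends up.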
Is that all?
The "problem" with immediate mode rendering is that you need to take
care of more things before rendering, because you don't have the comfort
of a handy World class that stores all camera, background and lighting
information. So what you need to control manually now is the clearing of
the viewport buffer with a Background object, the lighting of your
scene and the camera.
Background
To render in immediate mode, we have to clear the viewport manually,
thus preparing for the next drawing cycle. This can be done either
before, or after a rendering loop, but it needs to be done after you've
bound the Graphics3D object and before you've released it. M3G uses a
Background class to help you with this. The Background class holds a lot
of nifty information such as what background color to clear the screen
with and what image to draw as a background. It is also very useful, for
you can use a large image as a background, but only show bits and parts
of it as you move around. For instance, you could have a large PNG of
your horizon, and then as the player moves in the game world, you can
move your Background's crop area to display other parts of the horizon.
Beware however, using large PNGs is not only very slow, but also very
memory inefficient. The most important methods of the Background class
are the following:
setColor (int ARGB)
setCrop (int cropX, int cropY, int width, int height)
setImageMode (int modeX, int modeY)
setImage ( Image2D image)

// Loads an image; wrapped in an Image2D it could later be passed to setImage
Image img = Image.createImage("/myimage.png");
// Initializes our background
back = new Background();
back.setColor(0); // clear to black
// ...
// Inside the render loop: first bind your Graphics3D object
// ...
// Now simply clear the screen
g3d.clear(back);
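To make the scrolling-horizon idea above concrete, here is a small plain-Java helper (a sketch; the name `horizonCropX` and the 360-degree mapping are my own assumptions, not part of the M3G API) that maps a player heading to a horizontal crop offset suitable for `setCrop`:

```java
public class HorizonCrop {
    /**
     * Maps a heading in degrees to a horizontal crop offset into a
     * panoramic background image. Hypothetical helper, not part of M3G.
     */
    static int horizonCropX(float headingDegrees, int imageWidth) {
        float norm = headingDegrees % 360f;
        if (norm < 0f) {
            norm += 360f; // wrap negative headings into [0, 360)
        }
        return (int) (norm / 360f * imageWidth);
    }

    public static void main(String[] args) {
        // Facing 90 degrees into a 1024-pixel-wide panorama:
        System.out.println(horizonCropX(90f, 1024)); // prints 256
    }
}
```

Each frame you would then call something like `back.setCrop(horizonCropX(heading, 1024), 0, screenWidth, screenHeight)` before clearing.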
Another thing you need to control manually is the lighting. You need to create lights and position them in 3D space with Transform matrices. This is all done inside the Graphics3D class with the following methods:
setLight (int index, Light light, Transform transform)
resetLights ()
You also need to create your own camera, instead of just using the one supplied in the World class, as we've done before. In this tutorial, we'll only create a Camera by calling its default constructor. In later parts of the tutorial we'll go through the more advanced things you can do with a Camera such as changing the projection matrix. I won't talk more about this topic right now, instead I'll just show you a code snippet on how this can be done:
Graphics3D g3d = Graphics3D.getInstance();
g3d.setCamera(cam, getCameraTransform());
Setting the stage
Now you know of the three things that we need to control manually and you're ready to render something in immediate mode. Before I show you any code, let's recap the steps needed:
- We need to add lights to our Graphics3D object, which is usually done when the scene is being initialized.
- We need to add the camera to the Graphics3D object. You can choose to do this once, or every game loop, depending on how you handle the Camera's Transform matrix.
- We need to clear the background so we can render onto a freshly painted canvas.
- We just render our meshes and release.
g3d = Graphics3D.getInstance();
// First bind the graphics object. We use our pre-defined rendering hints.
g3d.bindTarget(g, true, RENDERING_HINTS);
// Clear background
g3d.clear(back);
// Bind the camera at a fixed position at the origin
g3d.setCamera(cam, identity);
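Putting the four steps together, a complete immediate-mode frame might look like the following sketch (the names `back`, `cam`, `identity`, `mesh` and `meshTransform` are assumptions carried over from the snippets above; `RENDERING_HINTS` is the same pre-defined constant). Wrapping the work in try/finally is good practice, so the target is always released even if rendering throws:

```java
Graphics3D g3d = Graphics3D.getInstance();
try
{
    // 1. Bind the render target
    g3d.bindTarget(g, true, RENDERING_HINTS);
    // 2. Clear the viewport
    g3d.clear(back);
    // 3. Set the camera (here fixed at the origin)
    g3d.setCamera(cam, identity);
    // 4. Render meshes in immediate mode
    g3d.render(mesh, meshTransform);
}
finally
{
    // 5. Always release the target
    g3d.releaseTarget();
}
```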
Particle Systems
A 3D particle system usually consists of a data structure that represents a particle and its physical qualities (velocity, life and position), and of a system that handles the emission of particles. This is a very simple model, as you can make a particle system as complex as you wish. So, let's first create our Particle class. To represent a particle in 3D space we'll need its position, consisting of x, y and z coordinates. We also need its velocity, since we want the particle to move around in the 3D world. We need the color of a particle as well, so that different particles can have different colors. Finally, we'll need the life of a particle: how long it stays in the 3D universe before it is either discarded or re-animated at a new position with a new velocity and color. Here is a Particle class that'll cover our basic needs:
/**
 * Holds all the information of a particle.
 * A particle's alpha is controlled directly by its life. Its alpha is always
 * life * 255.
 */
public class Particle
{
// The life of the particle. Goes from 1.0f to 0.0f
private float life = 1.0f;
// The degradation of the particle
private float degradation = 0.1f;
// The velocities of the particle
private float[] vel = {0.0f, 0.0f, 0.0f};
// The position of the particle
private float[] pos = {0.0f, 0.0f, 0.0f};
// The color of the particle (RGB format 0xRRGGBB)
private int color = 0xffffff;
/** Empty initialization */
public Particle()
{
}
/**
* Initializes the particle
* @param velocity Sets the velocity
* @param position Sets the position
* @param color Sets the color (no alpha)
*/
public Particle(float[] velocity, float[] position, int color)
{
setVel(velocity);
setPos(position);
this.setColor(color);
}
/**
* @param life The life to set.
*/
void setLife(float life) {
this.life = life;
}
/**
* @return Returns the life.
*/
float getLife() {
return life;
}
/**
* @param tvel The velocity to set.
*/
void setVel(float[] tvel) {
System.arraycopy(tvel, 0, vel, 0, vel.length);
}
/**
* @return Returns the vel.
*/
float[] getVel() {
return vel;
}
/**
* @param tpos The position to set.
*/
void setPos(float[] tpos) {
System.arraycopy(tpos, 0, pos, 0, pos.length);
}
/**
* @return Returns the pos.
*/
float[] getPos() {
return pos;
}
/**
* @param color The color to set.
*/
void setColor(int color) {
this.color = color;
}
/**
* @return Returns the color.
*/
int getColor() {
return color;
}
/**
* @param degradation The degradation to set.
*/
public void setDegradation(float degradation) {
this.degradation = degradation;
}
/**
* @return Returns the degradation.
*/
public float getDegradation() {
return degradation;
}
}
import javax.microedition.m3g.Graphics3D;
/**
* The interface that determines which effect the particle engine will display.
* The ParticleEffect class also holds information about the bitmap used
* for displaying particles (if any)
*/
public interface ParticleEffect
{
// Initializes a particle
public void init(Particle p);
// Updates a particle
public void update(Particle p);
// Renders a particle
public void render(Particle p, Graphics3D g3d);
}
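As a taste of what an `update` implementation could do with the `life` and `degradation` fields (a plain-Java sketch under my own assumptions; concrete ParticleEffect implementations are built in part 2 of this article), each tick could decrease the particle's life and respawn it at full life once it has died:

```java
public class LifeStep {
    /**
     * One life-update step, as a hypothetical ParticleEffect.update
     * might perform it: degrade life each tick, and respawn (reset to
     * full life) once the particle has died.
     */
    static float updateLife(float life, float degradation) {
        life -= degradation;
        if (life <= 0f) {
            life = 1.0f; // respawn with full life
        }
        return life;
    }

    public static void main(String[] args) {
        float life = 1.0f;
        for (int tick = 0; tick < 3; tick++) {
            life = updateLife(life, 0.25f);
        }
        System.out.println(life); // 1.0 - 3 * 0.25 = 0.25
    }
}
```

Since the article states that a particle's alpha is life * 255, this same value can directly drive the alpha channel when the particle is rendered.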
/**
* Manages emission of particles in our 3D world
*/
public class ParticleSystem
{
// The effect
private ParticleEffect effect = null;
// The particles
Particle[] parts = null;
/**
* Creates a particle system that emits particles according to a defined effect.
* @param effect The effect that controls the behaviour of the particles
* @param numParticles The number of particles to emit
*/
public ParticleSystem(ParticleEffect effect, int numParticles)
{
// Copy the effect
setEffect(effect);
// Init the particles
parts = new Particle[numParticles];
for(int i = 0; i < numParticles; i++)
{
parts[i] = new Particle();
effect.init(parts[i]);
}
}
/** The method that does it all. Needs to be called every tick of a game loop */
public void emit(Graphics3D g3d)
{
for(int i = 0; i < parts.length; i++)
{
getEffect().update(parts[i]);
getEffect().render(parts[i], g3d);
}
}
/**
* @param effect The effect to set.
*/
public void setEffect(ParticleEffect effect) {
this.effect = effect;
}
/**
* @return Returns the effect.
*/
public ParticleEffect getEffect() {
return effect;
}
}
// Now we create a ParticleSystem with 20 particles, using a
// ParticleEffect instance pFx (an implementation of the interface above)
ParticleSystem pSys = new ParticleSystem(pFx, 20);
// Somewhere inside our game loop...
pSys.emit(g3d);
To represent a particle in 3D space, the best thing would be to use a Mesh that consists of a simple textured quad. A quad, as you might remember, is actually two triangles arranged so that they represent a square. Now, instead of creating a mesh in 3D Studio, exporting it as M3G and loading it into our program, it's much easier and faster to create the Mesh in code. If you remember the last tutorial, a model consists of faces, which themselves are composed of 3D points, or vertices. So to create a quad, we'll need four points, one for each corner. In M3G a model is described by the Mesh class, which holds all kinds of information such as vertices, texture coordinates, faces, polygon rendering modes, etc. We'll be creating a Mesh from code. The Mesh class needs three things to display a model: a VertexBuffer, an IndexBuffer and an Appearance. Let's see how we'll create each of them in order.
The VertexBuffer
This class is a very handy one. It holds a lot of information about a model, including vertices, texture coordinates, normals and colors. For our very simple model, we'll need vertices and texture coordinates; we won't be using normals or colors this time, since we don't need them. The VertexBuffer stores vertex and texture information in a class called the VertexArray. The VertexArray is a pretty simple class that internally stores values in an array. When you create it, you define how many elements each of your points has and how many bytes each element will occupy. Now you might be wondering: why do I choose the number of elements? Aren't 3D coordinates always a triple, x, y and z? You are right of course, 3D coordinates are always placed along three axes and do have three elements. However, there are other coordinates that are interesting as well, such as texture coordinates. In this example we will use a simple texture coordinate model that only uses pairs of coordinates. Now, before we actually start creating our VertexArrays, let's look at the coordinates. Here are the vertices of our model (a simple limited plane with four corners), followed by its texture coordinates:
short vertices[] = new short[] { -1, -1, 0,
                                  1, -1, 0,
                                  1,  1, 0,
                                 -1,  1, 0 };
short texCoords[] = new short[] { 0, 255,
                                  255, 255,
                                  255, 0,
                                  0, 0 };
// Four vertices, three components each, two bytes per component
vertexArray = new VertexArray(vertices.length / 3, 3, 2);
vertexArray.set(0, vertices.length / 3, vertices);
// Four texture coordinates, two components each, two bytes per component
texArray = new VertexArray(texCoords.length / 2, 2, 2);
texArray.set(0, texCoords.length / 2, texCoords);
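With the two VertexArrays filled, the VertexBuffer itself can be assembled. The following is a sketch under my own assumptions about scale and bias: positions are used as-is (scale 1, no bias), while the 0..255 texture coordinates are scaled down to the 0..1 range:

```java
VertexBuffer vertexBuffer = new VertexBuffer();
// Positions: no scaling (scale = 1), no offset (bias = null)
vertexBuffer.setPositions(vertexArray, 1.0f, null);
// Texture coordinates for texturing unit 0, scaled from 0..255 down to 0..1
vertexBuffer.setTexCoords(0, texArray, 1.0f / 255.0f, null);
```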