Texture
java.lang.Object
  |
  +--javax.microedition.m3g.Object3D
        |
        +--javax.microedition.m3g.Transformable
              |
              +--javax.microedition.m3g.Texture2D
An Appearance component encapsulating a two-dimensional texture image and a set of attributes specifying how the image is to be applied on submeshes. The attributes include wrapping, filtering, blending, and texture coordinate transformation.
Texture image data
The texture image is stored as a reference to an Image2D. The image may be in any of the formats defined in Image2D. The width and height of the image must be non-negative powers of two, but they need not be equal. The maximum allowed size for a texture image is specific to each implementation, and it can be queried with Graphics3D.getProperties().
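The power-of-two rule above is easy to verify before binding an image. The following is a minimal, M3G-independent sketch; the class and method names are hypothetical and not part of the API:

```java
public class TextureCheck {
    // A texture dimension is valid if it is a power of two (1, 2, 4, 8, ...).
    // n & (n - 1) clears the lowest set bit, so the result is 0 exactly
    // when n has a single set bit.
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }
}
```

The maximum dimension still has to be checked against the implementation limit reported by Graphics3D.getProperties().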
Mipmap level images are generated automatically by repeated filtering of the base level image. No particular method of filtering is mandated, but a 2x2 box filter is recommended. It is not possible for the application to supply the mipmap level images explicitly.
If the referenced Image2D is modified by the application, or a new Image2D is bound as the texture image, the modifications are immediately reflected in the Texture2D. Be aware, however, that switching to another texture image or updating the pre-existing image may trigger expensive operations, such as mipmap level image generation or (re)allocation of memory. It is therefore recommended that texture images not be updated unnecessarily.
Texture mapping
Transformation
The first step in applying a texture image onto a submesh is to apply the texture transformation to the texture coordinates of each vertex of that submesh. The transformation is defined in the Texture2D object itself, while the texture coordinates are obtained from the VertexBuffer object associated with that submesh.
The incoming texture coordinates may have either two or three components (see VertexBuffer), but for the purposes of multiplication with a 4x4 matrix they are augmented to have four components. If the third component is not given, it is implicitly set to zero. The fourth component is always assumed to be 1.
The texture transformation is very similar to the node transformation. They both consist of translation, orientation and scale components, as well as a generic 4x4 matrix component. The order of concatenating the components is the same. The only difference is that the bottom row of the matrix part must be (0 0 0 1) in case of a node transformation but not in case of a texture transformation. The methods to manipulate the individual transformation components of both node and texture transformations are defined in the base class, Transformable.
Formally, a homogeneous vector p = (s, t, r, 1), representing a point in texture space, is transformed to a point p' = (s', t', r', q') as follows:
p' = T R S M p
where T, R and S denote the translation, orientation and scale components, respectively, and M is the generic 4x4 matrix.
The translation, orientation and scale components of the texture transformation can be animated independently from each other. The matrix component is not animatable at all; it can only be changed using the setTransform method.
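The transformation p' = T R S M p can be sketched in plain Java without the M3G API. This is an illustration only: the class and helper names are hypothetical, matrices are row-major double[16] arrays, and the orientation component R and the generic matrix M are left as identity for brevity. Note how a 2-component coordinate (u, v) is augmented to (u, v, 0, 1) before the multiply:

```java
public class TexTransform {
    // Multiply two 4x4 row-major matrices: r = a * b.
    static double[] mul(double[] a, double[] b) {
        double[] r = new double[16];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    r[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
        return r;
    }

    // Apply a 4x4 matrix to a column vector.
    static double[] apply(double[] m, double[] v) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++)
            for (int k = 0; k < 4; k++)
                r[i] += m[i * 4 + k] * v[k];
        return r;
    }

    // p' = T R S M p, with R and M as identity in this sketch.
    static double[] transform(double tx, double ty, double scale,
                              double u, double v) {
        double[] T = {1,0,0,tx,  0,1,0,ty,  0,0,1,0,  0,0,0,1};
        double[] S = {scale,0,0,0,  0,scale,0,0,  0,0,scale,0,  0,0,0,1};
        double[] I = {1,0,0,0,  0,1,0,0,  0,0,1,0,  0,0,0,1};
        double[] m = mul(mul(mul(T, I), S), I);    // T * R * S * M
        return apply(m, new double[]{u, v, 0, 1}); // augment (u, v)
    }
}
```

With a uniform scale of 2 and a translation of (0.5, 0), the coordinate (1, 1) maps to (2.5, 2), matching the concatenation order: scale first, translate last.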
Projection
The texture transformation described above yields the transformed texture coordinates (s', t', r', q') for each vertex of a triangle. The final texture coordinates for each rasterized fragment, in turn, are computed in two steps: interpolation and projection.
1). Interpolation. The per-vertex texture coordinates are interpolated across the triangle to obtain the "un-projected" texture coordinate for each fragment. If the implementation supports perspective correction and the perspective correction flag in PolygonMode is enabled, this interpolation must perform some degree of perspective correction; otherwise, simple linear interpolation may (but does not have to) be used.
2). Projection. The first three components of the interpolated texture coordinate are divided by the fourth component. Formally, the interpolated texture coordinate p' = (s', t', r', q') is transformed into p'' = (s'', t'', r'', 1) as follows:
p'' = p'/q' = (s'/q', t'/q', r'/q', 1)
Again, if perspective correction is either not supported or not enabled, the implementation may do the projection on a per-vertex basis and interpolate the projected values instead of the original values. Otherwise, some degree of perspective correction must be applied. Ideally, the perspective divide would be done for each fragment separately.
The r'' component of the result may be ignored, because 3D texture images are not supported in this version of the API; only the first two components are required to index a 2D image.
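The projection step is a plain component-wise divide by q'. A minimal sketch (hypothetical class name, not part of the M3G API):

```java
public class TexProjection {
    // Perspective divide: (s', t', r', q') -> (s'/q', t'/q', r'/q', 1).
    static double[] project(double[] p) {
        double q = p[3];
        return new double[]{ p[0] / q, p[1] / q, p[2] / q, 1.0 };
    }
}
```

For example, (2, 4, 0, 2) projects to (1, 2, 0, 1); only the first two components are then used to index the 2D image.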
Texel fetch
The transformed, interpolated and projected s'' and t'' texture coordinates of a fragment are used to fetch texel(s) from the texture image according to the selected wrapping and filtering modes.
The coordinates s'' and t'' relate to the texture image such that (0, 0) is the upper left corner of the image and (1, 1) is the lower right corner. Thus, s'' increases from left to right and t'' increases from top to bottom. The REPEAT and CLAMP texture wrapping modes define the treatment of coordinate values that are outside of the [0, 1] range.
Note that the t'' coordinate is reversed with respect to its orientation in OpenGL; however, the texture image orientation is reversed as well. As a net result, there is no difference in actual texture coordinate values between this API and OpenGL in common texturing operations. The only difference arises when rendering to a texture image that is subsequently mapped onto an object. In that case, the t texture coordinates of the object need to be reversed (t' = 1 - t). If this is not done at the modeling stage, it can be done at run-time using the texture transformation. Of course, the whole issue of texture coordinate orientation is only relevant in cases where existing OpenGL code and meshes are ported to this API.
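The effect of the two wrapping modes on an out-of-range coordinate can be sketched as follows (hypothetical helper names; the real selection is made with Texture2D.setWrapping):

```java
public class TexWrap {
    // WRAP_CLAMP: coordinates outside [0, 1] are clamped to the image edge.
    static double clamp(double c) {
        return Math.max(0.0, Math.min(1.0, c));
    }

    // WRAP_REPEAT: only the fractional part is used, so the image tiles.
    static double repeat(double c) {
        return c - Math.floor(c);
    }
}
```

So under CLAMP, 1.5 samples the edge texel at 1.0, while under REPEAT it samples the middle of a second tile at 0.5; negative values such as -0.25 wrap to 0.75.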
Texture filtering
There are two independent components in the texture filtering mode: filtering between mipmap levels and filtering within a mipmap level. There are three choices for level filtering and two choices for image filtering, yielding the six combinations listed in the table below.
Level filter | Image filter | Description                                                               | OpenGL equivalent
-------------|--------------|---------------------------------------------------------------------------|----------------------
BASE_LEVEL   | NEAREST      | Point sampling within the base level                                      | NEAREST
BASE_LEVEL   | LINEAR       | Bilinear filtering within the base level                                  | LINEAR
NEAREST      | NEAREST      | Point sampling within the nearest mipmap level                            | NEAREST_MIPMAP_NEAREST
NEAREST      | LINEAR       | Bilinear filtering within the nearest mipmap level                        | LINEAR_MIPMAP_NEAREST
LINEAR       | NEAREST      | Point sampling within two nearest mipmap levels                           | NEAREST_MIPMAP_LINEAR
LINEAR       | LINEAR       | Bilinear filtering within two nearest mipmap levels (trilinear filtering) | LINEAR_MIPMAP_LINEAR
Only the first combination (point sampling within the base level) must be supported by all implementations. Any of the other five options may be silently ignored.
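The mapping from an M3G (level filter, image filter) pair to its OpenGL equivalent in the table above is mechanical, which the following sketch makes explicit. The names are plain strings here, standing in for the FILTER_BASE_LEVEL, FILTER_NEAREST and FILTER_LINEAR constants of Texture2D; this is an illustration, not API code:

```java
public class FilterMode {
    // Map an M3G (level filter, image filter) pair to the OpenGL
    // minification filter name from the table above.
    static String toGL(String level, String image) {
        if (level.equals("BASE_LEVEL"))
            return image;                    // no mipmapping: NEAREST or LINEAR
        if (level.equals("NEAREST"))
            return image + "_MIPMAP_NEAREST"; // sample one nearest level
        return image + "_MIPMAP_LINEAR";      // level == LINEAR: blend two levels
    }
}
```

Note the naming inversion: the OpenGL constant puts the image filter first and the level filter second.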
M3G requires the width and height of a texture image each to be a power of two; they need not be equal.
Avoid re-binding or modifying the Image2D referenced by a Texture2D while the program is running: the change takes effect immediately, but may trigger expensive work such as mipmap regeneration or memory reallocation.
Note: the texture coordinate (uv) convention differs slightly from the vertex coordinate system. In M3G the origin (0, 0) is at the upper-left corner of the image, with u (s) increasing from left to right and v (t) from top to bottom; this is the reverse of OpenGL's bottom-up convention, as described above.
A full texture coordinate is written (s, t, r, q), where (s, t) corresponds to the uv of common 3D modeling tools, i.e. the (x, y) of a flat texture image; r is only meaningful for 3D textures, which JSR-184 does not support, so it defaults to 0 and may be ignored; q is the homogeneous coordinate, normally 1.
Texture wrapping has two modes: clamp (WRAP_CLAMP) and repeat (WRAP_REPEAT).
The texture coordinates for the four corners of a quad can be defined like this:

short[] texCoords = new short[] {
    0, 2,  // bottom left
    2, 2,  // bottom right
    2, 0,  // top right
    0, 0   // top left
};

With clamp mode, coordinate values outside [0, 1] are clamped to the image edge:

texture.setWrapping(Texture2D.WRAP_CLAMP, Texture2D.WRAP_CLAMP);

With repeat mode, only the fractional part of each coordinate is used, so the image tiles twice across the quad:

texture.setWrapping(Texture2D.WRAP_REPEAT, Texture2D.WRAP_REPEAT);