The 3D Experience engine is a complex piece of software that coordinates and synchronizes many different types of data. It uses the GPU to render a complete 3D interface and to accelerate the playback of video and 3D files.
Figure: a block diagram of how the engine works.
As you can see, 3D Experience can handle almost every video and audio file format as long as a DirectShow codec is installed for it. It can transcode from format to format: MPEG-2 to DivX to WMV, and so on. In addition to video and audio, 3D Experience can load complete 3D models, parts of 3D models, and a 3D UI.
3D Experience takes these data types and synchronizes them into "chunks" of data. These chunks can contain audio, video, 3D meshes, animation data, texturing data, UV mapping data, and more. The animation manager can accept meshes and animation data from any published modelling package, but we have not yet released a plugin for the mesh formatter.
"Chunk" contains all the data necessary to create
a frame of multimedia. There can be as many frames in a second
as you wish but there typically 25/50 frames per second or
30/60 frames per second or 24 frames per second (feature
3D Experience creates a connection between DirectShow and DirectX by using the GPU's memory and a patented technique called Managed Networked Render Targets (MNRT).
3D Experience OpenGL ES is presently being developed with the assistance of mobile graphics chip manufacturers.
DirectX Video Acceleration (DXVA), new mesh acceleration techniques, and direct memory management are used to take full advantage of the graphics hardware. Video and images are mixed within a 3D scene, and it is within this scene that pixel shaders and vertex shaders can be used to add special effects.
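Conceptually, a pixel shader is a small function run over every pixel of the scene. The pure-Python sketch below applies a grayscale "special effect" to a tiny RGB image to illustrate that per-pixel model; a real shader would be written in HLSL and executed on the GPU, so this is only a stand-in for the concept.

```python
# Conceptual stand-in for a pixel shader: a function mapped over
# every pixel of an image. Real shaders run this in parallel on
# the GPU; here we just loop in Python.

def grayscale_shader(pixel):
    r, g, b = pixel
    # Standard Rec. 601 luminance weights.
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)

def apply_shader(image, shader):
    """Run the shader once per pixel, like a fullscreen pass."""
    return [[shader(px) for px in row] for row in image]

image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
print(apply_shader(image, grayscale_shader))
```

A vertex shader works the same way but over mesh vertices instead of pixels, which is what lets effects operate on both the video textures and the 3D geometry in the scene.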
The content can be output in various formats, from WMV HD to MPEG-4, and even some initial MPEG-21 formats. 3D Experience has been used to rapidly create a whole range of products. We hope you enjoy using them.