Sunday, April 27, 2014

X3DOM - Day 3

X3DOM


%-------------------------------------------------------------------------------

Goal : Multi-channel visualization based on the X3DOM

Todo :

  1. Study
    • How to apply shaders for edge blending and warping?
    • (Added) How to apply a rendered texture?
    • How to manipulate the camera projection matrix in X3DOM?
    • How to synchronize variables using JS?
  2. Design
    • Web-based system
    • What will be the main difference from multi-channel visualization using Unity3D?
    • Connect-and-play?
  3. Implementation
    • TBA...
%-------------------------------------------------------------------------------

  • How to manipulate camera projection matrix on X3DOM?

- It took about a full week to do this.

- First of all, X3DOM is still under development, so its functionality is somewhat limited.

- Also, I think the documentation should be supplemented more.

- This led me to read the source code of the X3DOM libraries, which had been unnecessary until now.

- You can find the source code HERE

  1. Development Environments
- Until the last posting, I only used a plain text editor (EDITPLUS, actually) to edit the code.

- Then I found FIREBUG, a nice web development tool that runs in the Firefox browser.

- This tool also enables debugging with breakpoints, which helped me A LOT!!!


  2. <RenderedTexture> and <ViewFrustum>

- Currently, the <RenderedTexture> node only works (renders) well with a <Viewpoint> node reference.

- "Works well" means the texture updates automatically according to changes in the camera.

- But I need to manipulate the projection matrix of the camera to get a proper multi-channel view frustum.

- So, with the <ViewFrustum> node, I had to find a way to update the rendered texture myself.
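Each channel of a multi-channel display normally needs its own asymmetric (off-axis) view frustum. Here is a minimal sketch of the standard glFrustum-style construction in plain JS; the function name and the row-major layout are my own choices, not X3DOM API:

```javascript
// Off-axis (asymmetric) perspective frustum, as in OpenGL's glFrustum.
// Returns a row-major 4x4 matrix as a flat array of 16 numbers.
function frustumMatrix(l, r, b, t, n, f) {
    return [
        2 * n / (r - l), 0,               (r + l) / (r - l),  0,
        0,               2 * n / (t - b), (t + b) / (t - b),  0,
        0,               0,              -(f + n) / (f - n), -2 * f * n / (f - n),
        0,               0,              -1,                  0
    ];
}

// Example: the left half of a symmetric frustum (near plane z = 1) becomes an
// off-axis frustum running from l = -1 to r = 0.
var leftChannel = frustumMatrix(-1, 0, -1, 1, 1, 100);
```

Slicing one symmetric frustum into per-channel off-axis frustums like this keeps adjacent channels seamless at the shared edges.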

- I checked that the main camera returns the right position and orientation values through an event listener. The code snippet is below:

%---------------------
function addIt() { // onload: register a 'viewpointChanged' listener on the cam01 node
    document.getElementById('cam01').addEventListener('viewpointChanged', viewFunc, false);
}

var viewFunc = function (evt) {
    pos = evt.position;    // position (3-vector)
    rot = evt.orientation; // orientation (quaternion, 4-vector)
    mat = evt.matrix;      // ??? (4x4 matrix)
};
%-----------------------------

- In short, I needed to manually build an MV (modelview) matrix from the position and orientation values.
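For reference, the textbook construction I was after looks like this: build the rotation matrix R from the quaternion, then the view matrix is Rᵀ combined with the translation −Rᵀp. A plain-JS sketch with my own names (this is the standard formula, not X3DOM's internal code):

```javascript
// Standard view (MV) matrix from camera position p and orientation quaternion q.
// p = [x, y, z]; q = [x, y, z, w]. Returns a row-major 4x4 flat array.
function viewMatrix(p, q) {
    var x = q[0], y = q[1], z = q[2], w = q[3];
    // Rotation matrix R of the camera pose (row-major 3x3).
    var R = [
        1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w),
        2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w),
        2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)
    ];
    // View matrix = inverse of the camera pose: rotation part is R transposed.
    var m = [
        R[0], R[3], R[6], 0,
        R[1], R[4], R[7], 0,
        R[2], R[5], R[8], 0,
        0,    0,    0,    1
    ];
    // Translation part is -(R^T * p).
    m[3]  = -(m[0] * p[0] + m[1] * p[1] + m[2]  * p[2]);
    m[7]  = -(m[4] * p[0] + m[5] * p[1] + m[6]  * p[2]);
    m[11] = -(m[8] * p[0] + m[9] * p[1] + m[10] * p[2]);
    return m;
}
```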

- All my attempts failed, since I didn't know exactly how the 'modelview' attribute is handled.

- Also, evt.matrix is a really weird one: somewhat similar to the MV matrix, but not exactly the same.

- I solved the problem by tracking down the source code that generates evt.matrix, and inversely calculating the MV matrix of the original camera from it.

- Actually, it was very simple.

%---------------------
var t = document.getElementById('cam01');    // cam01 element
var m = t._x3domNode.getCurrentTransform(); // cam01 transform
var m2 = m.mult(mat);                       // cam01 transform * evt.matrix
var m3 = m2.inverse();                      // inverse of it...
var m4 = m3.transpose();                    // ...and transpose
%-----------------------------

- The m4 matrix is the original MV matrix.

- Then, passing the MV matrix value to the <ViewFrustum> node of the <RenderedTexture> is enough for synchronization.
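As for the actual hand-off, the matrix has to be serialized into the space-separated string form an X3D attribute expects. A tiny hypothetical helper (the element id 'frustum01' is a placeholder of mine; the 'modelview' attribute follows the discussion above but should be checked against the sources):

```javascript
// Hypothetical helper: flatten a matrix (array of 16 numbers) into the
// space-separated string form used by X3D attributes.
function matrixToAttribute(m) {
    return m.join(' ');
}

// Sketch of the hand-off inside viewFunc, after computing m4:
//   document.getElementById('frustum01')
//           .setAttribute('modelview', matrixToAttribute(m4.toGL()));
```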


You can see that the camera position & orientation for
the rendered texture are synchronized with the original scene camera.

Only the projection matrix is different, which I intended
for multi-channel visualization.



※ All of my web pages showing results are available only while I'm running my web server.
%------------------------------------------------------------------------------

Additional Resources

    Sunday, April 13, 2014

    X3DOM - Day 2

    X3DOM


    %-------------------------------------------------------------------------------

    Goal : Multi-channel visualization based on the X3DOM

    Todo :

    1. Study
    • How to apply shader for edge blending and warping? 
    • (Added)How to apply render texture?
    • How to manipulate camera projection matrix on X3DOM?
    • How to synchronize variables using js?
    2. Design
    • Web-based system
    • What will be the main difference with multi-channel visualization using Unity3D?
    • Connect-and-play?
    3. Implementation
    • TBA...
    %-------------------------------------------------------------------------------

    • (Added)How to apply render texture?

    - See THIS EXAMPLE.

    - By using the <RenderedTexture> node (though it is not described in THIS PAGE. Why?)

    - The camera (a <Viewpoint> node) is referenced in the <RenderedTexture> node.

    - Also, the <Group> node that contains the contents of the scene has to be referenced.
    If not, the rendered texture gets mixed up with the background(?) scene like below, and the same goes for the viewpoint setting.
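The wiring described above looks roughly like this (a sketch: the ids, dimensions, and containerField names are my assumptions drawn from X3DOM examples, so check them against the actual docs):

```xml
<RenderedTexture dimensions='512 512 4'>
  <Viewpoint USE='cam01' containerField='viewpoint'></Viewpoint>
  <Group USE='sceneRoot' containerField='scene'></Group>
</RenderedTexture>
```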



    - Next, the <ComposedShader> from the last posting is used to render the image from the camera.

    - We can separate the vertex shader and the fragment shader for independent use.

    1. Vertex shader --> sets the viewport of the rendering result.
    2. Fragment shader --> the edge blending function.
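The blending function itself is just a ramp over the overlap zone. A plain-JS sketch of a common formulation (after Paul Bourke's edge-blending notes; in the real system this runs per-fragment in GLSL, and the power p is a tunable):

```javascript
// Attenuation over the blend zone: x in [0,1] across the overlap, p shapes the
// curve (p = 2 is a common default). Returns 0 at the outer edge, 1 inside.
function blendCurve(x, p) {
    return (x <= 0.5)
        ? 0.5 * Math.pow(2 * x, p)
        : 1 - 0.5 * Math.pow(2 * (1 - x), p);
}
```

Two overlapping projectors each apply this ramp in opposite directions, so the summed intensity stays roughly constant across the overlap.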

    - And...HERE is my result.


    The scene is composed of a red box and a camera.

    The left side shows the rendered texture image with edge blending shader.
    (you can see the attenuation at both ends like the result in DAY1.)

    The right side shows the rendered texture image with default fragment shader.


    ※ All of my web pages showing results are available only while I'm running my web server.

    %------------------------------------------------------------------------------

    Additional Resources

    Wednesday, April 9, 2014

    Mesh Comparison - Day 1

    DSME - (Histogram-based) Qualitative Mesh Comparison between ship CAD systems.

    %-------------------------------------------------------------------------------

    Goal : Develop the module for 3D mesh comparison, in ACIS environment

    Todo :

    1. Study
    • Qualitative shape (search, detection, matching, comparison, ...)
    • MeshLab(VCG) or OpenMesh library
    • ACIS
    2. Implementation
    • TBA...
    %-------------------------------------------------------------------------------

    • Qualitative shape (search, detection, matching, comparison, ...)

    1. Global feature-based --> "...fail to capture the specific details of a shape..."; less preferred
    2. Manufacturing feature-based --> we don't have the required input data
    3. Graph-based --> B-Rep / voxel-based? less preferred
    4. Histogram-based
    5. Product information-based --> we don't have the required input data
    - Histogram-based research[1][2][3]

    - Conceptually divided into 1) an alignment stage and 2) a comparison stage. [Hanra, Soonjo]

    - Obviously, defining the distance metric will be the key point.
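To make the comparison stage concrete, here is a toy sketch (entirely my own simplification, not any of the cited methods): bin a sampled shape quantity into a normalized histogram, then use an L1 distance as the metric.

```javascript
// Normalized histogram of sampled values over [min, max).
function histogram(values, bins, min, max) {
    var h = [], i;
    for (i = 0; i < bins; i++) h.push(0);
    for (i = 0; i < values.length; i++) {
        var b = Math.floor((values[i] - min) / (max - min) * bins);
        h[Math.min(bins - 1, b)] += 1 / values.length; // bins sum to 1
    }
    return h;
}

// L1 distance between two normalized histograms: 0 = identical, 2 = disjoint.
function l1Distance(h1, h2) {
    var d = 0;
    for (var i = 0; i < h1.length; i++) d += Math.abs(h1[i] - h2[i]);
    return d;
}
```

The real choice of sampled quantity (e.g. pairwise point distances) and of metric is exactly the open design question noted above.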

    - On the other hand, the research objective is strictly limited to parts for shipbuilding,
    so maybe we can make a contribution under those circumstances.


    - MeshLab is a tool based on the VCG library.

    - We have not used it at the library level (VCG) before; only the MeshLab script was used in the Macro-Parametrics project.

    - OpenMesh is a library based on a half-edge data structure for the mesh; it was used before in the work presented at ACDDE 2012.

    - Both OpenMesh and VCG have default mesh comparison functionality.

    - On OpenMesh(OpenFlipper)


    Comparison result between a sphere and a box.
    Euclidean distance measurement is applied.

    - On MeshLab


    Comparison result between a sphere and a box (the box has different dimensions from the test case above).
    Hausdorff distance measurement is applied.
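For reference, the Hausdorff measure reduces to the following on plain point sets (a toy sketch of mine; MeshLab actually samples one surface and measures to the other):

```javascript
// One-sided Hausdorff distance between point arrays A and B (points = [x,y,z]):
// the worst case, over a in A, of the distance to a's nearest neighbour in B.
function hausdorff(A, B) {
    function d2(p, q) { // squared Euclidean distance
        var dx = p[0] - q[0], dy = p[1] - q[1], dz = p[2] - q[2];
        return dx * dx + dy * dy + dz * dz;
    }
    var worst = 0;
    A.forEach(function (a) {
        var best = Infinity;
        B.forEach(function (b) { best = Math.min(best, d2(a, b)); });
        worst = Math.max(worst, best);
    });
    return Math.sqrt(worst);
}

// Symmetric version: Math.max(hausdorff(A, B), hausdorff(B, A))
```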

    - Alignment should be investigated further. Both tools provide only a manual alignment interface.



    %------------------------------------------------------------------------------

    Additional Resources


    Saturday, April 5, 2014

    X3DOM - Day 1

    X3DOM

    %-------------------------------------------------------------------------------

    Goal : Multi-channel visualization based on the X3DOM

    Todo :

    1. Study
    • How to apply shader for edge blending and warping?
    • How to manipulate camera projection matrix on X3DOM?
    • How to synchronize variables using js?
    2. Design
    • Web-based system
    • What will be the main difference with multi-channel visualization using Unity3D?
    • Connect-and-play?
    3. Implementation
    • TBA...
    %-------------------------------------------------------------------------------

    • How to apply shader for edge blending and warping?

    - X3DOM supports the GLSL, HLSL, and Cg shader languages.

    - The <ComposedShader> and <ShaderPart> nodes have been used so far.

    - Shader code can be separated from the X3D body; this will be tested later.
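For reference, the node structure used so far looks roughly like this (a sketch: the 'blendWidth' field and the inline GLSL placeholders are my own; only <ComposedShader>/<ShaderPart> and the type values come from the X3D spec):

```xml
<Appearance>
  <ComposedShader>
    <field name='blendWidth' type='SFFloat' value='0.2'></field>
    <ShaderPart type='VERTEX'>
      /* ...vertex GLSL... */
    </ShaderPart>
    <ShaderPart type='FRAGMENT'>
      /* ...fragment GLSL, reading blendWidth as a uniform... */
    </ShaderPart>
  </ComposedShader>
</Appearance>
```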


    Toon Shading Test (done)
    • Modified slightly for my own use from the examples HERE
    Edge Blending Test (ongoing)
    • The existing edge blending shader (written for Unity3D) was converted to GLSL for testing
    • You can see the attenuation on the side parts.
    • Warping is also implemented, but the result shows something wrong with the UV coordinates. This should be checked later.


    %------------------------------------------------------------------------------

    Additional Resources


    1. Existing multi-channel visualization environment using Unity3D



    2. Edge blending and warping shader for each channel.

    3. Links for shader-related X3DOM documents