Monday, March 21, 2016

Youtube 360 Video


  • Detailed explanation of the production process added (here)
  • https://support.google.com/youtube/answer/6178631?hl=ko
  • You can change the viewing angle while watching a YouTube video if the video was produced from an equirectangular-projected 360-degree image.

  • 1920x1080p 
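To see what "equirectangular-projected" means concretely, here is a minimal sketch of how such a frame encodes the whole sphere: each pixel's UV maps linearly to longitude and latitude, and a 360 player looks up the pixel for the current view direction. This is an illustration of the projection only, not YouTube's actual implementation; the function name is mine.

```python
import math

def equirect_uv_to_dir(u, v):
    """Map equirectangular UV in [0,1]^2 to a unit view direction.

    Convention (my assumption): +z forward, +x right, +y up.
    """
    lon = (u - 0.5) * 2.0 * math.pi   # full 360-degree horizontal wrap
    lat = (v - 0.5) * math.pi         # -90 deg (bottom) to +90 deg (top)
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```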




Friday, March 18, 2016

Panoramic Rendering for CAVE facility

Overview


  • We have a new CAVE in our lab, consisting of three affordable home projectors (low ANSI lumens, full HD resolution) and a spherical screen that covers a 180-degree horizontal FOV.


Screen configuration (right picture is top view)
  • We decided not to spend time or budget on synchronizing a rendering cluster. Instead, we installed a Quadro in our single rendering node.
  • The actual dimensions we measured for the installed screen are shown below. You may notice the black cloths attached to the screen: because our projectors have a short throw distance, the installers decided not to use part of the screen. In the diagram, that unused part is drawn as if it covers the same arc length at the top and bottom, but in reality it does not. I will discuss this in a later post; in this post, I ignore the asymmetry.
  • The actual radius is about 2.5 m and the vertical FOV is about 48.91 degrees.
Picture of the installed screen and projectors


Warping and edge blending

  • Warping and edge blending are necessary for any multi-projector system. You can find more background at http://paulbourke.net/
  • The installation company did this job for us using their own calibration software and hardware, so there was not much for us to do up to this point.
  • Pictures before and after warping and edge blending are shown below.
(top) Before edge blending and warping (bottom) After edge blending and warping
  • Once warping and edge blending are done, the image generated by our rendering node maps properly onto the screen. Assuming 100% accurate warping, the relation between the four corners of the rendered image and those of the screen is shown in the diagram below.


Panoramic rendering for 180 degree horizontal FOV


  • Unfortunately, this is not the end of the process if you want to deliver the final image to the user properly.
  • The first problem is that an image generated by a single perspective camera cannot cover a 180-degree FOV. Suppose we render a 5760x1080 image while keeping the default FOV of Unity3D's camera (60 degrees vertical). At that aspect ratio, the rendered image contains pixel information for only about 144 degrees of horizontal FOV.
  • The diagram below shows how the image on our screen gets distorted by this horizontally "stretched" mapping.
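The 144-degree figure follows directly from the aspect ratio; a quick check of the arithmetic:

```python
import math

# Horizontal FOV implied by a 60-degree vertical FOV at 5760x1080 (16:3).
vfov = math.radians(60.0)
aspect = 5760 / 1080
hfov = math.degrees(2.0 * math.atan(math.tan(vfov / 2.0) * aspect))
print(round(hfov, 1))  # about 144 degrees -- well short of the 180 we need
```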

Solution : Panoramic rendering

  • To solve this problem, we need panoramic rendering using several cameras.
  • Simply put, we need a final image whose [0,0] UV coordinate carries the pixel information at [-90 deg, -24.46 deg] from the camera, and whose [1,1] UV coordinate carries [90 deg, 24.46 deg].
  • Usually you would render a cubemap and map it to a single image with an equirectangular projection. But I decided to implement a custom shader as a study, and I suspected a performance problem from rendering 6 images when only 3 are needed to cover 180 by 48.91 degrees. We can also optimize the resolution, since Unity forces power-of-two textures for cubemaps.
  • So I have three cameras in my scene (front/left/right), each rendering a 90-by-90-degree region. From those three rendered textures, the pixel values of the final 5760x1080 image are calculated by a shader with the following process:
    • Calculate the corresponding latitude and longitude angles from the UV coordinate of the final image
    • Convert the latitude and longitude angles to a unit direction vector
    • Extend the vector (ray) to the front/left/right image planes
    • If the ray hits a plane, calculate the local UV coordinate of the hit position in that image plane
    • Return tex2D(front/left/right image, local UV coordinate) as the pixel value
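The steps above can be sketched in Python (the actual implementation is a Unity shader; the axis conventions and function names here are my assumptions, not the post's code):

```python
import math

HALF_VFOV = 48.91 / 2.0  # degrees, from the post's screen geometry

# Yaw of each camera's forward axis relative to "front", in degrees.
CAMERAS = {"left": -90.0, "front": 0.0, "right": 90.0}

def uv_to_latlon(u, v):
    """Step 1: map final-image UV in [0,1]^2 to (lat, lon) in degrees."""
    lon = (u - 0.5) * 180.0             # [-90, +90] horizontally
    lat = (v - 0.5) * 2.0 * HALF_VFOV   # [-24.455, +24.455] vertically
    return lat, lon

def latlon_to_dir(lat, lon):
    """Step 2: lat/lon to unit direction (+z forward, +x right, +y up)."""
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.sin(lo),
            math.sin(la),
            math.cos(la) * math.cos(lo))

def project(direction):
    """Steps 3-4: find which 90x90-degree camera the ray hits, and where.

    Each camera's image plane sits at distance 1 along its forward axis;
    tan(45 deg) = 1, so the plane spans [-1, 1] in both local axes.
    """
    x, y, z = direction
    for name, yaw in CAMERAS.items():
        a = math.radians(yaw)
        # Rotate the ray into this camera's frame (camera looks down +z').
        xc = x * math.cos(a) - z * math.sin(a)
        zc = x * math.sin(a) + z * math.cos(a)
        if zc <= 0.0:
            continue                    # ray points away from this camera
        px, py = xc / zc, y / zc        # intersection with the image plane
        if abs(px) <= 1.0 and abs(py) <= 1.0:
            return name, ((px + 1.0) / 2.0, (py + 1.0) / 2.0)
    return None, None                   # no camera covers this direction
```

Step 5 then becomes a texture fetch: `tex2D(textures[name], local_uv)` in the shader, for each pixel of the 5760x1080 output.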

Result video clip

  • The left quarter shows the 3D scene and the camera (red cube); the right side shows the rendered final image

Adjusting projection point to actual eye position

  • The final image above seems okay, but we can also take the user's actual eye position in the facility into account.
  • As you can see in the diagram below, the user's eye is not located at the same height as the center of the screen sphere.

Solution : Changing projection origin

  • We can calculate the asymmetric upper and lower viewing angles (18.74 deg, 38.36 deg) from the screen radius (2.5 m), the distance from the screen sphere's center to the ground (1.25 m), the half vertical FOV from the sphere origin (24.46 deg), and the height of the actual eye position above the ground (1.70 m).
  • The panoramic rendering process is similar to the previous one. What changes is the mapping between UV coordinates and lat/lon angles, and the projection origin, which moves to the actual eye position.
Diagram of actual eye position, ground plane and projection screen
Changed uv-lat,lon mapping

  • Another change is that we now need a bottom camera, since part of the bottom becomes visible due to the extension of the lower viewing angle from 24.46 deg to 38.36 deg. Adding pixel information from the bottom image to the panoramic image works the same way as for the others (front/left/right).
Final image using the adjusted projection point and the front/left/right images. Green pixels show where no pixel information exists
Pixel information filled in by adding the bottom image

Result video clip



  • You can see the vertical asymmetry between the upper and lower viewing angles.

Observing result with application

  • We have an excavator simulator for validating a safety control algorithm. The pictures below show the application rendered on the screen 1) without any processing, 2) with panoramic rendering, and 3) with panoramic rendering plus eye position adjustment.
  • I took the pictures using the panorama capture function on my smartphone. The three results look very different when you stand in front of the projection screen, but the difference may seem less dramatic in the pictures below. I could not find a better way to show the result.
Scene without any processing. 144 HFOV, 60 VFOV

Scene with panoramic rendering. 180 HFOV, 48.91 VFOV
Scene with panoramic rendering and eye position adjustment. 180 HFOV, 57.1 VFOV 
  • You may notice the effect of the eye position adjustment by looking at the building in the 3D scene. The pictures were taken right at my eye position.

The building on the left side in the panoramic rendering scene
The building on the left side in the panoramic rendering scene with eye position adjustment