Saturday 19 January 2019

PBR theory and implementation 3

  • Calculating the Lighting 

To calculate the irradiance arriving at a point, we need to sum the light that hits that point from every direction.



To do this we need to calculate the integral of the incoming light over the hemisphere (the dome) above the point.

given the light arriving at point p from direction ωi : L(p, ωi)



With all the knowledge of BRDF and irradiance, the Cook-Torrance reflectance equation will look like this:
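For reference (the equation image did not survive), the usual form of the Cook-Torrance reflectance equation is, with $k_d$ the diffuse fraction, $c$ the albedo, and $D$, $F$, $G$ the normal distribution, Fresnel and geometry terms:

```latex
L_o(p,\omega_o) = \int_{\Omega}
  \left( k_d \frac{c}{\pi}
       + \frac{D\,F\,G}{4\,(\omega_o \cdot n)(\omega_i \cdot n)} \right)
  L_i(p,\omega_i)\,(n \cdot \omega_i)\; d\omega_i
```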



Direct Light

For direct light it is simple: we don't need to integrate over the whole dome.

We just need to scale the light by cos θ (θ being the angle between the light vector and the normal vector).
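A minimal plain-Python sketch of this scaling (function and argument names are mine, not from any engine API):

```python
def direct_light(light_color, light_dir, normal):
    """Scale the light by cos(theta) between light direction and normal.
    Both vectors are assumed normalized; back-facing light clamps to 0."""
    cos_theta = sum(l * n for l, n in zip(light_dir, normal))
    n_dot_l = max(cos_theta, 0.0)
    return tuple(c * n_dot_l for c in light_color)
```

A light shining straight down the normal contributes at full strength; a light behind the surface contributes nothing.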

IBL

The environmental light is a bit trickier. With environmental light, the whole environment is the light source and lights up the target surface. It is crucial for realistic rendering, but calculating the light from all directions is nearly impossible in real-time rendering (I say nearly because the latest hardware and frameworks, like RTX, are starting to support real-time ray-traced rendering). Instead, we use an environment texture as the light source to mimic the environmental light. This technique is called image-based lighting, or IBL.

Calculating the integral is very heavy in real time. To solve this problem,
we divide the render equation into two parts :


The left side is the diffuse part, the right side is the specular part.

diffuse IBL

The result of the diffuse IBL only depends on ωi, the sample direction on the environment map. This enables us to precalculate the integral over the texture and save it to a new texture, called an irradiance map.
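The precalculation can be sketched as a Monte-Carlo average of cosine-weighted environment samples over the hemisphere around each normal (a plain-Python sketch; `env_sample` is a hypothetical environment-map lookup, and the normalization is the one I'd use, not any particular engine's):

```python
import math
import random

def irradiance(normal, env_sample, num_samples=1024):
    """Monte-Carlo estimate of (1/pi) * integral over the hemisphere of
    L(w) * cos(theta) dw -- the value an irradiance map stores per texel."""
    total = 0.0
    for _ in range(num_samples):
        # uniform direction on the unit sphere via normalized gaussians
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        inv_len = 1.0 / math.sqrt(sum(x * x for x in d))
        d = [x * inv_len for x in d]
        cos_theta = sum(a * b for a, b in zip(d, normal))
        if cos_theta > 0.0:           # keep only the upper hemisphere
            total += env_sample(d) * cos_theta
    # uniform-sphere pdf is 1/(4*pi), so the integral estimate is
    # mean * 4*pi; dividing by pi leaves a factor of 4
    return (total / num_samples) * 4.0
```

With a constant environment of 1 the result converges to 1, which is the sanity check I'd run on any irradiance-map bake.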

At run time, we do :


an image from Wave Engine showing the difference between an environment map and an irradiance map :



my implementation detail :
I don't have a precalculated irradiance map. I could have written a C++ program to calculate it on the CPU...

I simply use the mipmap to get the blur effect

Specular IBL

divide the right (specular) part into two parts:



  • Pre-filtered environmental map


The left part is called the pre-filtered environment map; it accumulates the lights that contribute to the final reflection. For diffuse light, lights from all directions are counted. For specular, only a small part of the light affects the result. A perfect mirror has a perfect reflection, so light can only be seen along the reflect vector; rough metal has a scattered reflection because light from more directions can be observed.





The rougher the surface is, the wider the range of directions whose light affects the reflection at the viewing direction.
The result will be








The higher the mipmap level is, the more samples are used for the integral, which also means the rougher the surface is.
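At run time the usual trick, and the one I'd sketch here (my sketch, not the exact engine code), is to map roughness linearly onto the mip chain of the pre-filtered map:

```python
def prefiltered_mip(roughness, num_mips):
    """Map roughness in [0, 1] to a mip level of the pre-filtered
    environment map: rougher surface -> blurrier (higher) mip."""
    return roughness * (num_mips - 1)
```

A perfectly smooth surface samples mip 0 (the sharp reflection); full roughness samples the last, blurriest mip.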


  • BRDF integration map






My implementation detail :

I use importance sampling to approximate the radiance integral

I referenced Real Shading in Unreal Engine 4 to implement the specular IBL
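The importance-sampling step in those notes is the Hammersley + GGX half-vector scheme; a plain-Python sketch of it (tangent space only, normal assumed to be +Z, with alpha = roughness² as in the UE4 notes):

```python
import math

def hammersley(i, n):
    """Low-discrepancy 2D point: (i/n, radical inverse of i in base 2)."""
    bits = 0
    x = i
    for _ in range(32):
        bits = (bits << 1) | (x & 1)
        x >>= 1
    return i / n, bits / 2**32

def importance_sample_ggx(xi, roughness):
    """Sample a half-vector around +Z distributed by the GGX lobe."""
    a = roughness * roughness
    phi = 2.0 * math.pi * xi[0]
    cos_theta = math.sqrt((1.0 - xi[1]) / (1.0 + (a * a - 1.0) * xi[1]))
    sin_theta = math.sqrt(1.0 - cos_theta * cos_theta)
    return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)
```

Each sample i uses `hammersley(i, N)` as xi; rougher surfaces spread the returned half-vectors further from +Z.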

Tuesday 15 January 2019

Normal Normal Normal

Why normal map

To calculate the lighting on an object we need to take the dot product between the light vector and the normal.

lightness = dot(lightVector , normal);

What is normal map

To record normal detail while keeping the vertex count of the object down, the normal map technique is used. Instead of grabbing the normal directly from the vertex information, we get it from a normal map, which records more normal detail.

A regular normal map records a normal vector N(x,y,z) as a color (R,G,B). Of course, to enhance the accuracy of the normal map there are other ways to encode the vector.
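The basic (R,G,B) encoding maps each component from [-1, 1] into the [0, 1] color range; a plain-Python sketch of the round trip:

```python
def decode_normal(rgb):
    """Unpack a tangent-space normal stored as a color: [0,1] -> [-1,1]."""
    return tuple(2.0 * c - 1.0 for c in rgb)

def encode_normal(n):
    """Pack a unit normal into color channels: [-1,1] -> [0,1]."""
    return tuple(0.5 * c + 0.5 for c in n)
```

This is why flat areas of a normal map look light blue: (0.5, 0.5, 1.0) decodes to the straight-up normal (0, 0, 1).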

How normal map is constructed

To apply a 2D normal map in 3D space, tangent space is created.
Tangent space is built according to UV space; the third axis is built by the cross product of the U and V axes.
The normal map is recorded in tangent space. When using it, we transform the tangent-space normal back to object / world space to calculate the lighting.

several advantages :
1. the texture can be tiled.
2. when the object deforms, the texture goes with the object.

coordinate transform

matrix multiplication is used to transform either a coordinate or a vector.

a vector v in space A can be transformed into space B by multiplying by
A's expression in B. A column vector uses right multiplication, a row vector uses left multiplication. D3D uses left (row vectors), OpenGL uses right (column vectors).

refer to 3D math Primer for Graphics and Game Development 
page 104

Tangent space coordinate construction

in d3d we can get normal and tangent from vertex info
float3 N: NORMAL
float3 T : TANGENT
float3 B : BINORMAL

the Tangent space expression in Object space is :
MOT(B , T , N)

right multiplication mul(MOT , v)
transforms a vector from object space to tangent space

left multiplication mul(v , MOT)
transforms a vector from tangent space to object space

an orthogonal matrix's inverse equals its transpose.

thus

mul(MOT ,v) = mul(v, transpose(MOT))
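The same transforms sketched in plain Python (rows of MOT are the tangent-space axes expressed in object space; the basis values are made up for illustration):

```python
def mat_vec(m, v):
    """mul(M, v): treat v as a column vector -> dot each row with v."""
    return tuple(sum(r[i] * v[i] for i in range(3)) for r in m)

def vec_mat(v, m):
    """mul(v, M): treat v as a row vector -> linear combination of rows."""
    return tuple(sum(v[i] * m[i][j] for i in range(3)) for j in range(3))

# Rows of MOT are T, B, N expressed in object space (a made-up
# orthonormal basis here).
T, B, N = (1, 0, 0), (0, 0, 1), (0, 1, 0)
MOT = (T, B, N)

tangent_normal = (0.0, 0.0, 1.0)              # "straight up" in tangent space
object_normal = vec_mat(tangent_normal, MOT)  # tangent -> object space
```

Going the other way, `mat_vec(MOT, object_normal)` recovers the tangent-space normal, which is exactly the inverse-equals-transpose relationship above.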

apply normal map in the shader

a good tutorial

To use the normal map on the model means: transform the normal recorded in T space into O space.





Monday 14 January 2019

Houdini - erosion

Houdini terrain master class 1


user interface layout can be selected and reset in :
Menu -> Build

Heightfield 

the height field is created around the center, with the radius being half of the size axis.
grid spacing : larger spacing creates lower resolution.

heightfield paint

paints a mask on the height field; you need to enable Show Handle to show the brush
LMB to paint the mask
MMB to erase
mouse scroll wheel to change the brush size
the masked part will show red

heightfield noise

creates noise on the height field
input1 : heightfield
input2 : heightfield mask








combine method : the calculation method between the heightfield and the noise


amplitude : changes the maximum height of the noise


element size : size of noise


noise type : Worley (cellular). Manhattan Worley is very useful to create chunky flat areas on the terrain.

geometry and transform & heightfield

geometry can be projected onto a heightfield:


the transform node transforms the geometry
heightfield_project projects the geometry onto the heightfield.
input1 : heightfield
input2 : geometry

heightfield distort by noise

uses a noise field to move the existing values
a comparison between distort by noise and heightfield noise

heightfield layer : composites two heightfield layers

input1 : hf1
input2 : hf2
input3 : mask

heightfield file

heightfield file import image to heightfield

heightfield distort by layer

the distortion can be controlled by another heightfield layer
the distortion keeps the shape of the base terrain layer while bringing in some features from the add-on layer.
input1 : hf to be distorted (terrain)
input2 : add-on hf (noise)

a comparison of before and after the distortion
heightfield mask clear
clears the red mask visually for a better view

heightfield erosion

visualization : creates a color scheme for the terrain

The place to reset the simulation and freeze a frame (once you decide it is the final result and don't want to touch it anymore), and to start an erosion, is the bottom-left corner.

Global Erosion Rate : a higher rate gives a stronger erosion effect

Main Tab
  • Hydro : erosion caused by water
Bank Angle : a smaller value creates wider and flatter river banks, a high value cuts deep into the terrain; a rule of thumb is to use a high value when creating the basic mass form.
spread iterations : a higher number makes the river run longer
  • Thermal erosion : rock becomes sand
cut angle : controls where erosion starts; a low value allows more erosion, a higher value creates a cap on the mountain

Advanced Tab
  • Hydro Erosion

removal rate : removes the debris by wind. Setting it negative adds more debris, which is useful to create a muddy effect.

max debris depth : allows thicker debris to build up over time.
Grid Bias : controls the direction of the erosion

Erodability
initial factor : increases the strength of the erosion
slope factor : a low value lets flat areas erode, a higher value constrains the erosion to steep slope areas.

Riverbed
erosion rate factor : a higher value cuts deeper into the river; a lower value causes a scattered, isolated-lake effect.
deposition rate : a higher value makes the earth harder to erode.
sediment capacity : how far sediment can travel with the water. Makes water go further in narrow areas, and also creates mud in flat and wide areas.
Riverbank
erosion rate factor : a higher number causes more erosion on the river bank, giving a wider bank
Max Bank to Bed Water Ratio : how much is considered bank

Thermal erosion
has similar settings to hydro erosion

Precipitation
this is where the rain washes down the debris; hydro erosion needs this to take shape.
amount : a higher number means more rain
density : a higher number gives more detailed branches on the water path

raindrop settings
expand radius : channels and water areas become larger with a larger value

Debris flow
spread iterations : a higher value makes the debris spread wider
quantization : a higher value makes the debris more chunky
water absorption : a higher value creates a muddy and narrow water bank; a lower value creates a wider and flatter area.
max height : how high the debris can climb up the mountain when it is washed down.
repose angle : the angle at which debris can stand before it falls down


layer
the layer options are for multi-layer erosion: either delete or keep the previous erosion effect.
remove layer vs delete layer :
remove layer still keeps the water / debris, but they are not calculated in the erosion; this is good for adding details on a second layer of erosion
delete layer : useful when you want to keep the form of the previous erosion but create the data set separately.
add layer to final height : bakes the layer into the heightfield

heightfield remap
remaps the height of a terrain; normally it becomes more realistic.

heightfield blur can be used to blur the heightfield after erosion to create a realistic result.


normal noise and Chebyshev Worley noise add noise but also hard lines on top of the mountain; the size of the noise is the key to achieving a good result.


heightfield resample
increases the resolution of the heightfield
useful when the first layer of erosion is finished, before going on to the second layer.

heightfield



heightfield_slump


heightfield_flowfield
adds the existing flow into the dataset.

Sunday 13 January 2019

shadow

shadow map theory

a shadow map is used to calculate shadows in real time.
There are two passes in this technique

pass 1 : in the vertex shader, the z depth is rendered from the light's view; the pixel shader returns 0. Normally this uses the render-to-texture technique; Unity does it for us here.
pass 2 : find out whether a pixel is in shadow by comparing its distance to the light with the z depth rendered in the 1st pass. Normally the shadow map created in the 1st pass is sampled here.

shadow acne


one problem of shadow maps is shadow acne.
to decide whether a point is in shadow, we compare its distance to the light against the light-view depth map.
As the resolution of the shadow map is limited, the camera-rendered pixels cannot fully align with the shadow-map pixels. When the position requested by the camera is slightly in front of the light-view depth (in fact they should be the same), it creates shadow in the middle of the lit area.


To fix it, set the shadow bias.
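The biased depth comparison sketched in plain Python (names are mine; `shadow_map_depth` is the value rendered in pass 1):

```python
def in_shadow(dist_to_light, shadow_map_depth, bias=0.005):
    """A point is shadowed when it is farther from the light than the
    depth the light 'saw' at that texel; the bias absorbs the small
    depth mismatch from limited shadow-map resolution (shadow acne)."""
    return dist_to_light - bias > shadow_map_depth
```

Without the bias, a point a hair's width beyond the stored depth would falsely shadow itself; with it, only genuinely occluded points fail the test.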

shadow render in Unity

  • a shadow casting pass needs to be added
LightMode needs to be ShadowCaster

Include the file "Shadow.cginc" and add the shadow caster code in it.
a vertex shader is all the shadow caster pass needs; the pixel shader just returns 0

the basic process needed is to transform the vertex into clip space.
UnityClipSpaceShadowCasterPos(pos, nor); is used to apply the normal bias
UnityApplyLinearShadowBias(pos); is used for fixing shadow acne
  • Shadow receiving pass
multi-compile the shadow modes in the light pass that receives the shadow:
allocate the memory for the shadow map coordinates in the vertex output
SHADOW_COORDS(coordinate number)
compute the UV coordinates for the shadow map in the vertex shader
TRANSFER_SHADOW(OUT)

the shadow will be added into the light automatically when using
UNITY_LIGHT_ATTENUATION

Thursday 3 January 2019

Houdini - VEX notes


  • Houdini processes points in a similar way to a shader

vector ppos = @P;
@P means the current point's position.
@ means either a global variable or an attribute.


  • use an index to get the input of a node, from left to right : 0, 1, 2


It is the opinput in the function
example :



  • write to the attribute

f@distance = distance; // the attribute is a float
v@flow = displacement; // the attribute is a vector


  • add an attribute
v@flow = set(0,0,1);
this will add a vector attribute to the current point.

Tuesday 1 January 2019

Unity Shader reference Notes - Multi Light

multiple lights are handled in different passes.

base pass


  • the first directional light is in the base pass
  • the base pass is drawn whether or not there is a directional light.
  • declared in Tags, after the pass begins and before CGPROGRAM
  • vertex lights need to be in the base pass, declared with VERTEXLIGHT_ON

          

add pass

  • enable the blending between the base and add pass with Blend One One
  • support directional / point / spot lights : multi_compile_fwdadd

          

include file

an include file aims to reuse code; it holds the vertex / fragment function code, some variable declarations, and other include files.
Including simply replaces the include line with all the code in the included file.
  • a typical include guard looks like this. It prevents including the same file twice.

          
  • include files end with ".cginc"
  • the base pass and the add pass can use the same include file

support different lights

  • calculate light.dir differently according to the light type
  • UNITY_LIGHT_ATTENUATION(atten, vertexOut, vertexOut.worldPos) provides the attenuation for point and spot lights; it is always 1 for a directional light.

          

vertex light

  • up to four vertex lights can be used. When the pixel lights reach the set limit, Unity switches to vertex lights.
  • to set the pixel light limit : edit -> project settings -> quality -> pixel light count
  • declare in the base pass   #pragma multi_compile _ VERTEXLIGHT_ON
  • in the vertex output structure, declare vertexLightColor to pass the vertex light to the fragment function
          
  • calculate the vertex lights using the function Shade4PointLights()
          Unity records the vertex lights' positions, colors and attenuation in the following variables


  • vertex light is put into indirect light

          
One very important thing about vertex lights:
to force vertex lights, set the light's Render Mode to Not Important in the light settings
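A hedged plain-Python sketch of the per-vertex diffuse term for one point light (roughly what a Shade4PointLights-style helper computes per light; the names and the simple 1/(1 + d²) falloff are my assumptions, not Unity's exact curve):

```python
import math

def vertex_point_light(light_pos, light_color, world_pos, normal):
    """Diffuse contribution of one point light at a vertex:
    the N.L term times a simple distance attenuation."""
    to_light = [l - p for l, p in zip(light_pos, world_pos)]
    dist_sq = sum(x * x for x in to_light)
    inv_len = 1.0 / math.sqrt(dist_sq)
    n_dot_l = max(0.0, sum(n * t * inv_len
                           for n, t in zip(normal, to_light)))
    atten = 1.0 / (1.0 + dist_sq)   # assumed falloff, not Unity's exact one
    return tuple(c * n_dot_l * atten for c in light_color)
```

Summing this over the four recorded vertex lights and adding the result to the indirect diffuse term matches the flow described in the bullets above.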

environment Light

  • environmental light is captured by spherical harmonic functions in Unity.
  • the function should only be used in the base pass
  • its result should contribute to the indirect diffuse light.