drdante said:
Tachikoma said:

Keep at it and let us all know how it goes!


 

Of course I will update my progress... that is, when I've achieved something lol

Right now I'm still working through your tutorial and watching YouTube, so this may take a while.

Whether you are shuffling forwards or taking giant strides, forwards is still forwards!




-snip



Tagging.



My dream job :')

( I'm really just tagging, but I don't have anything interesting to say )



Tachikoma said:

[...]

I am essentially writing a new normal mapping shader that allows multi-stage normal map passes that interact with each other. Normally the normal map tells the render pipeline how to light a material based on the material's normal map definition alone; the problem with this is that, while it delivers a nice uniform bumpy surface, the surface itself still looks flat overall, and the only current workaround is to physically model deformities into the surface.

What I'm doing, however, is taking a low-level normal map and passing its light calculations on to a second normal map, where they affect the base calculations of that second map. The result is that the flat surface responds to light in a much more organic manner.
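A minimal CPU-side sketch (NumPy, my illustration rather than the actual shader code) of that chaining: the lighting computed from the coarse base normal map attenuates the lighting computed from the detail normal map, so fine bumps sitting in large-scale shade stop being lit uniformly.

```python
import numpy as np

def lambert(normals, light_dir):
    # Per-texel Lambertian term max(0, N . L); normals has shape (H, W, 3).
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)
    return np.clip(np.einsum('ijk,k->ij', normals, l), 0.0, 1.0)

def two_stage_shading(base_normals, detail_normals, light_dir):
    # Stage 1: light the coarse base normal map on its own.
    base_light = lambert(base_normals, light_dir)
    # Stage 2: light the fine detail map, then let the stage-1 result
    # attenuate it, so detail bumps lying in large-scale shade go dark
    # instead of being lit uniformly across the flat surface.
    return lambert(detail_normals, light_dir) * base_light

# Toy 2x4 "texture": the right half of the base map leans away from the light.
base = np.tile([0.0, 0.0, 1.0], (2, 4, 1)).astype(float)
base[:, 2:] = [-0.7, 0.0, 0.714]
detail = np.tile([0.2, 0.0, 0.98], (2, 4, 1)).astype(float)  # uniform fine bump

light = [1.0, 0.0, 1.0]
print("single pass:", lambert(detail, light)[0])                  # same value everywhere
print("two stage:  ", two_stage_shading(base, detail, light)[0])  # right half falls into shade
```

In a real pipeline both stages would run per fragment on the GPU; the multiplication is just one plausible way the first pass could "affect the base calculations" of the second.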


[...]

The advantages of this are significantly reduced resource requirements compared to actually modelling uneven surfaces, and full control over deformation of flat surfaces through shader control of the two stages of normal mapping. A height-defined process can also be added to make the applied texture change depending on the calculated height of the bump math, making it possible for dynamic weather effects to weather the mapped area realistically according to the perceived geometrical structures.
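For the height-defined texture change, a rough sketch of the principle (the thresholds and texture names are made up for illustration, not taken from the shader): each texel picks its weathered texture from the height band it falls into.

```python
import numpy as np

def weather_by_height(height, tex_low, tex_mid, tex_high,
                      low_cut=0.33, high_cut=0.66):
    # height: (H, W) values in [0, 1] derived from the bump data.
    # Pick a texture per texel by height band, so weathering settles where
    # the perceived geometry says it should; a real shader would blend
    # (smoothstep) across the band edges instead of hard-switching.
    h = height[..., None]
    return np.where(h < low_cut, tex_low,
                    np.where(h < high_cut, tex_mid, tex_high))

# Toy 1x4 strip whose height ramps from crevice to peak.
height = np.array([[0.10, 0.40, 0.60, 0.90]])
grime  = np.tile([0.30, 0.25, 0.20], (1, 4, 1))   # collects in the low areas
stone  = np.tile([0.55, 0.55, 0.55], (1, 4, 1))
snow   = np.tile([0.95, 0.95, 1.00], (1, 4, 1))   # caps the high areas
print(weather_by_height(height, grime, stone, snow))
```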

To explain this roughly.

The first line is a standard normal map: light is calculated and bounced based on this normal map alone, but light hits the entire surface and is therefore calculated uniformly across that coverage.

The second line shows combining large normal shapes with the detail map: you end up with a bumpier, more varied map, but light coverage is still uniform across the whole texture, detracting from the effect.

The third shows my process: the red line represents the initial normal map pass, which passes its calculations on to the black line, the detail normal. This allows the light computation from the lower map to affect how the next map is lit, in effect giving you bumps that themselves cast shadow on the surrounding texture, creating much richer, more organic light coverage.
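Reading those three lines as code (again a NumPy sketch under my own assumptions, with the second-line merge done as a simple add-and-renormalise of the tangent-space perturbations, which is only one common way to merge normal maps):

```python
import numpy as np

def lambert(n, l):
    # Per-texel Lambertian term max(0, N . L); n has shape (H, W, 3).
    l = np.asarray(l, float)
    l = l / np.linalg.norm(l)
    return np.clip(np.einsum('ijk,k->ij', n, l), 0.0, 1.0)

def merge_maps(base, detail):
    # "Second line": fold the detail perturbation into the base map so a
    # single normal per texel is lit in one pass.
    n = base + detail * np.array([1.0, 1.0, 0.0])
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

base = np.tile([0.0, 0.0, 1.0], (1, 4, 1)).astype(float)
base[:, 2:] = [-0.7, 0.0, 0.714]               # large-scale shape leaning away from the light
detail = np.tile([0.2, 0.0, 0.98], (1, 4, 1)).astype(float)
L = [1.0, 0.0, 1.0]

line1 = lambert(detail, L)                     # standard map only: uniform response
line2 = lambert(merge_maps(base, detail), L)   # merged maps, still a single lighting pass
line3 = lambert(detail, L) * lambert(base, L)  # chained: the base pass shades the detail pass
print(line1[0], line2[0], line3[0], sep="\n")
```

The third result is where the large-scale pass darkens the detail bumps, which is the richer coverage being described.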


Wow, very cool! Once you decide on the standard normal map and the second-level one, can you precalculate the overall operator of the two stacked transformations, so that once that is done, lighting can be calculated in a single pass?
I guess your system could also be used to simulate minor changes in a game with a modifiable environment and objects, by modifying the detail layer.
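For the fully static case being asked about, the usual trick is to bake the two maps into one texture offline (a UDN-style blend: add the detail map's tangent-plane offsets to the base and renormalise), so runtime lighting is a single lookup; the catch, as the later reply makes clear, is that any light-dependent or dynamic interaction between the stages is lost. A sketch of that baking step, which is not the shader described in the thread:

```python
import numpy as np

def prebake_combined_normal(base, detail):
    # Offline pre-combination of two tangent-space normal maps into one
    # texture: add the detail map's xy perturbation onto the base map,
    # keep the base z, renormalise.  Runtime lighting then needs a single
    # normal-map lookup, but this only works while both maps stay static.
    combined = base.copy()
    combined[..., :2] += detail[..., :2]
    return combined / np.linalg.norm(combined, axis=-1, keepdims=True)

base   = np.tile([0.0, 0.0, 1.0], (2, 2, 1)).astype(float)
detail = np.tile([0.3, 0.1, 0.95], (2, 2, 1)).astype(float)
baked  = prebake_combined_normal(base, detail)   # store this as the texture to ship
print(baked[0, 0])
```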



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 



-snip



Tachikoma said:

[...]

I have it set up to pass variable parameters that can be accessed and changed per surface: the first is the texture coordinate, the second is the intensity, and the third is the low, mid and high range of calculated values for the height variant. With those I can apply additional shader math to do things like puddle formation for dynamic weather, where refraction and blur are used to simulate water, as below.

As the refraction is done based on the height data of the normal itself, it can be adjusted in depth, and the intensity of refraction changes as the depth of the water changes. I bumped up the refraction value to non-realistic levels here to make the effect clearer.
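A sketch of how the per-surface height ranges could drive the puddles, with the refraction offset growing with water depth; the water level, strength constant and return values are illustrative assumptions on my part, not the shader's real interface.

```python
import numpy as np

def puddle_refraction(height, normals, water_level=0.35, refraction_strength=0.08):
    # Water collects wherever the height field dips below the water level;
    # the deeper the water, the larger the refraction offset applied to the
    # texture lookup underneath (the post above notes the strength was
    # pushed to unrealistic levels to make the effect visible).
    depth = np.clip(water_level - height, 0.0, None)              # (H, W) water depth
    # Offset the lookup along the normal's tangent-plane components, scaled by depth.
    uv_offset = normals[..., :2] * (refraction_strength * depth)[..., None]
    wet_mask = depth > 0.0                                         # where to also blur / add specular
    return depth, uv_offset, wet_mask

height  = np.array([[0.10, 0.30, 0.50, 0.80]])                    # crevice ... ridge
normals = np.tile([0.2, 0.0, 0.98], (1, 4, 1)).astype(float)
depth, uv_offset, wet = puddle_refraction(height, normals)
print(depth[0]); print(wet[0])
```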

In the same vein, the upper limits of light-affected faces can be used to texture paint rather than illuminate, allowing snowfall to accumulate within the cracks/channels, or the height data can be passed on to particle effects so that things like sparks interact dynamically with the perceived geometry.
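The snow and particle ideas in the same style (again a CPU sketch with invented names and rates rather than the shader's own):

```python
import numpy as np

def accumulate_snow(height, snowfall, coverage=None, fill_rate=0.5):
    # Snow builds up fastest in the low-lying cracks/channels of the height
    # field; the returned coverage in [0, 1] can be used to paint a snow
    # texture over the base material rather than to light it.
    if coverage is None:
        coverage = np.zeros_like(height)
    coverage = coverage + snowfall * fill_rate * (1.0 - height)
    return np.clip(coverage, 0.0, 1.0)

def particle_ground_height(height, uv):
    # A particle (spark, snowflake) can sample the same height field at its
    # texture coordinate to react to the perceived geometry of a flat surface.
    h, w = height.shape
    x = min(int(uv[0] * w), w - 1)
    y = min(int(uv[1] * h), h - 1)
    return height[y, x]

height = np.array([[0.1, 0.4, 0.7, 0.9]])
print(accumulate_snow(height, snowfall=0.6))         # the cracks fill first
print(particle_ground_height(height, (0.8, 0.0)))    # a spark "lands" on the ridge
```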

The shader also reports the angle of the material relative to the horizon within the parameters, for both static and dynamic objects, so it would be possible to have things like water/snow slide or run off surfaces rather than just sit on top.
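Given that reported surface-to-horizon angle, runoff could be approximated per surface along these lines (the 45-degree hold angle and linear falloff are my illustrative choices):

```python
import numpy as np

def slope_runoff_factor(surface_normal, up=(0.0, 0.0, 1.0), max_hold_deg=45.0):
    # Tilt of the surface relative to the horizon, from its world-space normal;
    # past the hold angle, accumulated water/snow runs off instead of sitting.
    n = np.asarray(surface_normal, float)
    n = n / np.linalg.norm(n)
    u = np.asarray(up, float)
    u = u / np.linalg.norm(u)
    tilt_deg = np.degrees(np.arccos(np.clip(np.dot(n, u), -1.0, 1.0)))
    # 1.0 = everything stays (flat), 0.0 = everything runs off (at or past the limit).
    return float(np.clip(1.0 - tilt_deg / max_hold_deg, 0.0, 1.0))

print(slope_runoff_factor((0.0, 0.0, 1.0)))    # flat surface: snow/water stays
print(slope_runoff_factor((0.0, 0.7, 0.714)))  # ~45 degree slope: barely holds anything
print(slope_runoff_factor((0.0, 1.0, 0.0)))    # vertical wall: runs straight off
```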

All the while the surface is actually completely flat, as shown in the above image.

Oh, it does a lot more than I initially understood. I guess dynamic weather and other dynamic features can't be simplified further by precalculating and storing the overall transform operator, but as you explained, the big simplification given by the actually flat surface, and the consequent saving in polygon count, still applies.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


Re-tagging.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


-snip



Oh, I didn't know about this thread before, but it's very interesting.