I was wandering in the forest when I arrived at a clearing
with a great view to a neighboring hillside. The dense
foliage was lit by the setting sun, producing an
aesthetically pleasing sight. The low angle of the light
accentuated the complex, visually interesting nature of
that cloud of leaves and branches flowing down the slopes.
My TA lobe kicked in and I started wondering about the
following:
Could we estimate a normal map based on timelapse photos
of an outdoor subject?
Let’s assume that…
… the subject doesn’t move or change shape. (So either
there is no wind shaking the leaves, or we're dealing with
something rigid like a rock.)
… the weather is good, no clouds.
… we know the exact date and time for each picture.
… we know the exact location and orientation of the camera.
… we know the approximate location of the subject.
We take several photos throughout the day without moving
either the camera or the subject. Once that's done, we follow
the brightness changes of every pixel: the surface a pixel
covers was facing the sun most directly at the moment it was
brightest. The position of the sun can be computed from the
date, time and geo-location, and the approximate orientation
of the subject could also be factored in.
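In the Lambertian limit, "brightest when facing the sun" is
essentially the classic photometric-stereo solve, so here is a
minimal sketch of that step. It assumes a rigid grayscale image
stack, uses the pysolar library for the sun's position (any
solar-ephemeris code would do), and the coordinates, timestamps
and frames below are made-up placeholders.

```python
from datetime import datetime, timezone

import numpy as np
from pysolar.solar import get_altitude, get_azimuth


def sun_direction(lat_deg, lon_deg, when):
    """Unit vector pointing at the sun in an east-north-up frame."""
    alt = np.radians(get_altitude(lat_deg, lon_deg, when))
    az = np.radians(get_azimuth(lat_deg, lon_deg, when))  # assumed clockwise from north
    return np.array([np.sin(az) * np.cos(alt),   # east
                     np.cos(az) * np.cos(alt),   # north
                     np.sin(alt)])               # up


def fit_normals(brightness, sun_dirs):
    """Per-pixel least-squares Lambertian fit (photometric stereo).

    brightness: (T, H, W) stack of grayscale frames, sunlit samples only.
    sun_dirs:   (T, 3) sun direction for each frame.
    Returns (H, W, 3) unit normals.
    """
    T, H, W = brightness.shape
    b = brightness.reshape(T, -1)                      # (T, H*W)
    g, *_ = np.linalg.lstsq(sun_dirs, b, rcond=None)   # albedo * normal, (3, H*W)
    n = g / np.maximum(np.linalg.norm(g, axis=0), 1e-8)
    return n.T.reshape(H, W, 3)


# Hypothetical usage: three frames taken over one day.
lat, lon = 47.5, 19.0    # made-up camera location
times = [datetime(2024, 6, 21, h, 0, tzinfo=timezone.utc) for h in (9, 12, 15)]
sun_dirs = np.array([sun_direction(lat, lon, t) for t in times])
frames = np.random.rand(3, 4, 4)                        # stand-in for real photos
normals = fit_normals(frames, sun_dirs)
```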
When the brightness changes suddenly, we can assume the area
moved into or out of a shadow. We only care about a pixel
while it is directly lit, so if shadows come and go we
interpolate through the shadowed gaps.
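One possible heuristic for that, sketched under the assumption
that a shadowed frame sits well below the pixel's running
sunlit brightness level (the threshold is an arbitrary
placeholder and would need tuning on real footage):

```python
import numpy as np


def fill_shadow_gaps(series, drop_threshold=0.3):
    """series: 1-D brightness of one pixel over time, in [0, 1]."""
    series = np.asarray(series, dtype=float)
    # A frame counts as "in shadow" if it sits well below the sunlit level so far.
    sunlit_level = np.maximum.accumulate(series)        # crude sunlit estimate
    shadowed = series < (1.0 - drop_threshold) * sunlit_level
    if shadowed.all():
        return series                                   # never directly lit
    t = np.arange(len(series))
    filled = series.copy()
    filled[shadowed] = np.interp(t[shadowed], t[~shadowed], series[~shadowed])
    return filled


# Example: a pixel that dips into shadow mid-sequence.
b = np.array([0.55, 0.60, 0.12, 0.10, 0.66, 0.70])
print(fill_shadow_gaps(b))   # the 0.12/0.10 dip is bridged by interpolation
```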
The color of the sunlight can also be computed, so the
surface color can be compensated for it.
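A rough sketch of what that compensation could look like; the
sunlight tint model below is purely illustrative (a real
version would use a proper sky/sun model or a color-temperature
table), and the sample values are made up:

```python
import numpy as np


def sunlight_rgb(elevation_deg):
    """Very rough linear-RGB tint of direct sunlight vs. sun elevation."""
    # Low sun -> redder light; high sun -> near-white. Purely illustrative values.
    warm = np.array([1.00, 0.55, 0.25])    # near the horizon
    white = np.array([1.00, 0.96, 0.90])   # sun high in the sky
    t = np.clip(elevation_deg / 40.0, 0.0, 1.0)
    return (1 - t) * warm + t * white


def compensate(frame_rgb, elevation_deg):
    """Divide out the sunlight tint so frames become comparable."""
    return frame_rgb / sunlight_rgb(elevation_deg)   # frame_rgb: (H, W, 3), linear RGB


# Example: a patch observed at low sun, with made-up pixel values.
patch = np.array([[[0.50, 0.28, 0.13]]])
print(compensate(patch, 5.0))
```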
So that’s it. Thoughts?