Hi there, I am wondering if there is such a thing as morphing UVs. For example, a character has one UV set. This character will morph into something like an alien. However, from this intensive morphing, the initial UVs on the character get badly distorted. So, keeping the UV seams the same as the original, I tweak the UVs to fit the alien morph, which gives me a 2nd UV set.
My animation will capture the character morphing into the alien. Can I morph the 1st UV set (human character) to my 2nd UV set (alien character)?
My thought is that since all the vertex numbers on the character stay the same, the UV IDs (or whatever they are called) should be the same too, so they can be moved or animated in UV space.
That's correct, the vertex IDs are the same.

So you'll need to interpolate from UV1 to UV2 using something like a lerp function, driven by the morph amount value, which should be in the 0-1 range.
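In shader-style shorthand it is just a weighted blend (a one-line sketch; uv1, uv2 and morphAmount being whatever your setup calls them):

float2 uv = lerp(uv1, uv2, morphAmount);   // same as uv1 * (1.0 - morphAmount) + uv2 * morphAmount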
Though we'll need more information about whether you are doing this in real time or for a movie rendered offline in a DCC, because the way you approach this problem is entirely different. For realtime purposes it could be done on the CPU or on the GPU; for rendering purposes, such as in Maya, it could be done in a plugin or with an expression, I guess; I'm not sure what the best way would be.

You can even lerp from the mesh UV coordinates to some other way of mapping textures (some vector, world coordinates, etc.) to create some interesting effects. You can also use a texture itself to create interesting effects, besides just changing to a different UV set.
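As a rough sketch of that idea (HLSL-style; worldPos, projScale and blendAmount are made-up placeholders for a planar world-space projection):

float2 planarUV = worldPos.xz * projScale;                  // hypothetical planar mapping from world position
float2 uv = lerp(IN.texcoord0.xy, planarUV, blendAmount);   // blendAmount in the 0-1 range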
Thanks for the tips, guys :) I want to do it in realtime in a game engine like UDK. However, it would be cool to learn it for rendering too :) I'm using 3ds Max, though, but I think the idea would be similar.
Hi guys, I am trying to find a way of lerping between channel 1 and channel 2, but I couldn't find any way to do that. I'm using 3ds Max 2009. Can anyone help me with this? Thanks :)
Okay, for doing it in realtime, I'm not sure how this is done in UDK, but I can give tips for shading languages.
One way this could be done is in the vertex shader, in HLSL / CgFX / GLSL.
Just use this new texture coordinate when sampling a texture: float2 texcoord = lerp(IN.texcoord0.xy, IN.texcoord1.xy, morphAmount);
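To make that a bit more concrete, here is a minimal vertex shader sketch (plain D3D9-style HLSL; the struct layout, semantics and the morphAmount constant are assumptions, your engine will declare these differently):

float    morphAmount;     // 0 = human UVs, 1 = alien UVs, set from the application
float4x4 worldViewProj;

struct VS_IN
{
    float4 position  : POSITION;
    float2 texcoord0 : TEXCOORD0;   // UV set 1 (human)
    float2 texcoord1 : TEXCOORD1;   // UV set 2 (alien)
};

struct VS_OUT
{
    float4 position : POSITION;
    float2 texcoord : TEXCOORD0;    // blended UV passed on to the pixel shader
};

VS_OUT main(VS_IN IN)
{
    VS_OUT OUT;
    OUT.position = mul(IN.position, worldViewProj);
    OUT.texcoord = lerp(IN.texcoord0, IN.texcoord1, morphAmount);
    return OUT;
}

The pixel shader then just samples the texture with that interpolated texcoord as usual.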
I know UDK has a lerp node in the material editor, but I'm not sure if you can access the separate UV sets. I guess you can; I've never tried, because I've only used that editor twice. Besides, I'm not sure how global variables work in UDK, because I guess you want to keep morphAmount synced with how far the morph has progressed in gameplay. You'll need to ask a UDK specialist ^^
In UDK, just grab two “TexCoord” nodes (or TextureSampleParameter2D nodes if it's for a parent material) and a Lerp node (hold “L” and click in the shader window). Change one of the TexCoord nodes to use Coordinate Index 1 and leave the other one at 0.
UDK’s UVs, materials, etc. are thankfully 0-based, so if it’s UV channel 1 in Max, it’s UV channel 0 in UDK.
Then plug some float value into the “Alpha” input of the Lerp node.
Make sense?
The “float” value can be some triggered event from Kismet, IIRC (going from 0 to 1, for example).
You would normally just plug the result into the texture coordinate slot of the texture sampler, of course; I just plugged it into the diffuse slot to get a visual of the resulting UV data.
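In shader terms that debug trick is roughly this (an HLSL sketch, not the actual code UDK generates):

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    // show the blended UVs as red/green so you can eyeball how they morph
    return float4(uv.x, uv.y, 0.0, 1.0);
}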