Azimuthal projection: 3D points to a 2D map / UVs

I’ve got an issue (related to my other post, http://tech-artists.org/forum/showthread.php?4869-Complex-Lens-Shader, which got no responses… I no longer believe what I asked for there is possible, so this is my next approach to the problem…)

I’m trying to build UVs on an object based on an azimuthal projection; in short, equi-angular spaced UVs. In my head the math was simple, but it’s proving elusive. I’m sure it’s just a misapplication of some math function (acos, asin, atan2, sin, cos, etc.), but it has been many years since I’ve really done this kind of math (not that I was great at it to begin with).

With the projection point at the center of a hemisphere (180 degrees), I want to map the vertices of that hemisphere into a UV map like this (left side of the image).

I thought getting the normal to each vertex would work…
Pseudocode (assumes the cam is at the center of the hemisphere, facing +z):


normal = (vtx - cam).normalized   # unit direction from camera to vertex
u = normal.x
v = normal.y

But this only results in a planar projection along z, where the spacing gets denser toward the edges.
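A quick numeric check shows why (plain Python, just to illustrate): for equally spaced elevation angles, v = normal.y grows as sin(angle), so equal angle steps produce shrinking UV steps toward the rim.

import math

# equal 30-degree steps in elevation give unequal steps in normal.y
for deg in (0, 30, 60, 90):
    print(deg, round(math.sin(math.radians(deg)), 3))
# prints 0.0, 0.5, 0.866, 1.0 -- steps of .5, .366, .134, bunching at the rim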

I’ve also tried getting the angle of each vertex from the pole (+z) of the dome around the x and y axes separately, then putting those angles into u and v… That gives equal spacing along the x and y axes, but a flower-like pattern everywhere off them.

Ultimately I’m not going to be projecting from the center of the dome. I want to be able to move my camera and reproject my UVs, so the solution needs to work from any position.
This will be used to transfer the render of a 180-degree fisheye (equi-angular) lens onto a specific surface, so that it shows no distortion when seen from a non-centered point of view.
The image in my previous post may help illustrate this.

Any guidance in the right direction would be helpful.

Can’t wrap my head around the math right now either, but I know this projection issue comes up when encoding normals in the g-buffer using only two components.

Take a look at these - they might do what you need:
A Bit More Deferred Cry Engine3 | PPT (slide 13)
Khayyam: Encoding normals for Gbuffer
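For context, one well-known two-component encoding in that family is the spheremap transform, which is itself an azimuthal-style mapping of the hemisphere of normals. A rough Python sketch (my own paraphrase, not code from those links):

import math

def encode_normal_spheremap(x, y, z):
    # unit normal, z pointing toward the viewer (z > -1)
    scale = 1.0 / math.sqrt(8.0 * z + 8.0)
    return (x * scale + 0.5, y * scale + 0.5)   # two 0-1 components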

Are you OK with the projection being squashed in 3-space? I don’t think there is an authalic (area-preserving) polar projection, so you would have to choose between seams and stretching.

I think you can linearize the radial positions by squaring the components you got from the normals. A point at (.707, .707) would correspond to 45 degrees of yaw and 45 of azimuth, and squaring it gives (.4999…, .4999…), which is what you want for “halfway to the horizon in both directions”.

UPDATE:

Doh, that’s not right: 30 degrees – one third of the way to the horizon – comes out to .25, not .33. So no…
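Checking that against the straight angle fraction (a quick script, for anyone following along):

import math

# squared sin only matches the linear angle fraction at 0, 45, and 90 degrees
for deg in (30, 45, 60):
    print(deg, round(math.sin(math.radians(deg)) ** 2, 3), round(deg / 90.0, 3))
# 30 -> 0.25 vs 0.333, 45 -> 0.5 vs 0.5, 60 -> 0.75 vs 0.667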

Thank you both for your replies… Between the two posts, they pointed me in the right direction.
Then I did a search for plotting angle and distance and found this page.

My solution works for a camera facing +z, and for an object that already has a UV map (any will do… just make a planar projection first).
The following code is for Maya (Python), and uses my own Vector3 class for some of the common math ops such as Angle, Dot, +/- etc…


import math
import maya.cmds as cmds
from Vector3 import Vector3       # my custom vector class

cameraNode = 'myCamera'           # name of the camera
obj = 'myObject'                  # name of the object to UV map
fov = 180                         # field of view of the projection, in degrees


m = cmds.xform(cameraNode, q=True, m=True, ws=True)                   # world matrix of the camera (flat list of 16 floats, row-major)
f = Vector3(*m[8:11])                                                 # forward (+z) vector of the camera: third row (useful later when the camera isn't oriented to +z)
c = Vector3(*m[12:15])                                                # position of the camera in world space: fourth row

# Get the object's vertices and loop
vCount = cmds.polyEvaluate(obj, v=True)
for i in range(vCount):
    vtx = Vector3(*cmds.pointPosition('%s.vtx[%d]' % (obj, i), world=True))   # vertex position in world space
    n = Vector3.Normal(c, vtx)                                        # unit direction from camera to vertex
    o = vtx - c                                                       # offset from camera to vertex
    r = Vector3.Angle(n, f) / (fov * 0.5)                             # radius: angle (in degrees) between the forward vector and the vertex direction, normalized by the half-FOV

    theta = math.atan2(o.y, o.x)                                      # polar angle around the view axis, in radians (valid while the camera faces +z)
    u = (-r * math.cos(theta)) * 0.5 + 0.5                            # polar-to-cartesian x, remapped to the 0-1 UV range
    v = (r * math.sin(theta)) * 0.5 + 0.5                             # polar-to-cartesian y, remapped to the 0-1 UV range

    cmds.polyEditUV('%s.map[%d]' % (obj, i), r=False, u=u, v=v)       # set the UV position (assumes one UV per vertex, e.g. from the planar projection)
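As a quick sanity check of the mapping (assuming Vector3.Angle returns degrees): a vertex 45 degrees off the forward axis, straight out along the camera’s +x, should land halfway between the UV center and the rim.

# sanity check, using the same formulas as the loop above
r = 45.0 / (180 * 0.5)                  # 0.5: halfway to the rim, in angle
theta = math.atan2(0.0, 1.0)            # 0.0 radians
u = (-r * math.cos(theta)) * 0.5 + 0.5  # 0.25 (note the x flip from the -cos)
v = (r * math.sin(theta)) * 0.5 + 0.5   # 0.5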

The next step is to transform point positions from world space to camera space within Maya, so my camera can be rotated away from +z…
If I were using DirectX this would be built into the matrix classes.

Anyone have an approach that will work within Maya and Python? (I’d prefer not to write my own matrix class.)

You probably want to do this with PyMEL, so you can work with a matrix without having to write your own matrix class. You can probably just use the transform matrix of the camera, which should be orthonormal and consistently oriented. PyMEL also has a vector class of its own, which simplifies the math.
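For reference, a minimal sketch of what that world-to-camera transform might look like in PyMEL (untested, and assuming the camera transform is named 'myCamera'):

import pymel.core as pm

cam = pm.PyNode('myCamera')
worldToCam = cam.getMatrix(worldSpace=True).inverse()   # pm.dt.Matrix

p = pm.dt.Point(1.0, 2.0, 3.0)   # a vertex position in world space
pCam = p * worldToCam            # the same point in camera space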

Haven’t used PyMEL… I always thought it was just a more convenient wrapper around the maya.cmds module… didn’t know it had all the matrix and math ops in there. Thanks.
