Is it possible in UDK to project a texture onto a mesh’s shader from the player’s camera (or whichever camera is in use at the time)?
If it is, I’d love to hear about a way to do it
You’ll need to use a render2texture node and parent it to your camera so it captures the same image, but it’s possible
[QUOTE=lkruel;6411]You’ll need to use a render2texture node and parent it to your camera so it captures the same image, but it’s possible[/QUOTE]
But… that creates a texture of what the camera sees, right?
What I need is not that, but the ability to project a ready-made texture from a camera onto a mesh.
Is that possible? I’m really not familiar with all the nodes in the Coordinates section of the material editor :S
If it’s just for one object, you should be able to do it in a simple shader.
Basically you use the screen-space position of each vertex as its texture coordinate.
Pseudocode:
textureCoord.uv = (vertexPos.xyz*worldToScreenMatrix).xy
then you might wanna apply some more transformations.
This can be done in shader code - not sure if it’s possible with unreal’s shader graph.
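Fleshed out a little, that technique looks roughly like the following HLSL. Note this is a sketch only: worldPos, WorldToScreen and ProjectedTex are placeholder names, not actual UDK identifiers, and you would have to wire up real equivalents yourself.

```hlsl
// Sketch of camera projection: world position -> clip space -> screen UV.
// worldPos, WorldToScreen and ProjectedTex are placeholder names.
float4 clipPos = mul(float4(worldPos, 1.0), WorldToScreen); // to clip space
float2 ndc = clipPos.xy / clipPos.w;                        // perspective divide
float2 uv  = ndc * float2(0.5, -0.5) + 0.5;                 // [-1,1] -> [0,1], V flipped
return tex2D(ProjectedTex, uv);
```

The perspective divide by w is the step that makes it a true camera projection rather than a parallel one.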
[QUOTE=saschaherfort;6439]If it’s just for one object, you should be able to do it in a simple shader.
Basically you use the screen-space position of each vertex as its texture coordinate.
Pseudocode:
textureCoord.uv = (vertexPos.xyz*worldToScreenMatrix).xy
then you might wanna apply some more transformations.
This can be done in shader code - not sure if it’s possible with unreal’s shader graph.[/QUOTE]
Ok, well, I’m not a shader coder. I’m pretty much just a hobbyist right now, with an interest in these things.
Anyway, there’s a Custom node in the UDK material editor, which is meant for making completely customized effects in a material, if the user can code those effects in HLSL or CG, I guess.
I’ll try the code you gave and see if I can do something with it
I’ll be posting the results (or the lack of them)
hm… it keeps giving errors:
error X3004: undeclared identifier, for each of textureCoord, vertexPos and worldToScreenMatrix
any suggestions?
Yeah, that was pseudocode. i.e. not really code for you to use, just an illustration of the technique. There’s no way that is just going to work. You’ll have to find the proper way to do that in unreal.
[QUOTE=floatvoid;6442]Yeah, that was pseudocode. i.e. not really code for you to use, just an illustration of the technique. There’s no way that is just going to work. You’ll have to find the proper way to do that in unreal.[/QUOTE]
Ok, I just don’t think it’s possible with just the stock nodes… Is there any way you could come up with code that would actually work in a Custom node?
You can use a light function on a light to get what you’re trying to do.
[QUOTE=lkruel;6444]You can use a light function on a light to get what you’re trying to do.[/QUOTE]
projecting the image from a light that’s attached to the camera?
but that would affect the entire scene, not just the specific object
Ok, so let’s get back to basics.
What are you trying to do?
[QUOTE=lkruel;6446]Ok, so let’s get back to basics.
What are you trying to do?[/QUOTE]
I’m trying to get a texture projected from the player’s camera on to the surface of a mesh in the game.
Basically just fooling around, trying stuff and this one seemed like quite a challenge.
Is this like a UI element that gets mapped onto a sphere? What, specifically? Is this always camera-relative?
You can map the pixel position into the UVs of the texture in your shader.
[QUOTE=lkruel;6448]Is this like a UI element that gets mapped into a sphere? what specifically. Is this always camera relative?
you can map the pixelposition into the UV of the texture on your shader.[/QUOTE]
I don’t really know yet; I guess I could try to use it for something like that. Honestly, I haven’t yet thought of a practical application of this for myself.
Let’s say it’s the UI-thing. UI projected on a wall in the game.
And yea, it should always be projected from the camera that is being used at the moment.
About that mapping-thing… well… do you know of a way to do it in UDK?
Yeah, you take a camera vector node, then you connect a component mask into it, then check R and G on it, then connect that into the UV of a texture sample
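In Custom-node HLSL, that network amounts to roughly the following. This is a sketch, not verified UDK syntax: Tex and CamVec stand for inputs you would wire in yourself (a TextureSample and a CameraVector node).

```hlsl
// Equivalent of CameraVector -> ComponentMask(R,G) -> TextureSample UVs.
// Tex and CamVec are hypothetical Custom-node inputs, not built-in names.
return tex2D(Tex, CamVec.rg);
```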
[QUOTE=lkruel;6450]Yeah, you take a camera vector node, then you connect a component mask into it, then check R and G on it, then connect that into the UV of a texture sample[/QUOTE]
Hm, this seems quite promising. Thank you, I’ll see what I can achieve with this
[QUOTE=lkruel;6450]Yeah, you take a camera vector node, then you connect a component mask into it, then check R and G on it, then connect that into the UV of a texture sample[/QUOTE]
ok, I tried this, but it doesn’t quite cut it.
The texture isn’t projected from the camera (i.e. stretched to the camera’s aspect ratio and then mapped onto the object without tiling).
Here’s a pic, so you’ll see the problem it has. I tried it with a render-to-texture target and applied the resulting material to a wall.
The thing is that the texture shouldn’t be affected by the shape of the surface, or by its orientation relative to the camera. The texture should simply look like it’s flat on the screen, but be visible only where the mesh is, as if it were masked off by everything else.
Here’s something you could try: it’s the same network as for spherical mapping, but you swap the vector for the camera vector. Late reply, but I hope it’s what you’re looking for. :p
Have you tried this?
1.) Create a ScreenPos node (Coordinates>ScreenPos). Make sure the “ScreenAlign” option is checked
2.) Create a ComponentMask node (Utility>ComponentMask). Make sure only “R” and “G” are checked (should be by default)
3.) Create a TextureSample node (Texture>TextureSample)
Connect output of ScreenPos to input of ComponentMask
Connect output of ComponentMask to UV input of TextureSample node
Connect TextureSample to Diffuse or Emissive (or wire into your network however you need)
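For reference, the node chain above can be sketched as Custom-node HLSL like this. Again a sketch under assumptions: Tex stands for a texture input you wire in, and ScreenUV for the screen-aligned coordinate coming out of the ScreenPos node (with ScreenAlign checked, which should put it in the 0..1 range across the viewport).

```hlsl
// Rough equivalent of ScreenPos (ScreenAlign) -> ComponentMask(R,G) -> TextureSample.
// Tex and ScreenUV are hypothetical inputs, not built-in UDK names.
return tex2D(Tex, ScreenUV.rg);
```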
[QUOTE=wes3;6866]Have you tried this?
1.) Create a ScreenPos node (Coordinates>ScreenPos). Make sure the “ScreenAlign” option is checked
2.) Create a ComponentMask node (Utility>ComponentMask). Make sure only “R” and “G” are checked (should be by default)
3.) Create a TextureSample node (Texture>TextureSample)
Connect output of ScreenPos to input of ComponentMask
Connect output of ComponentMask to UV input of TextureSample node
Connect TextureSample to Diffuse or Emissive (or wire into your network however you need)[/QUOTE]
almost there, now the problem is that the texture coordinates have to be multiplied by the right numbers to get the right aspect ratio to fill the whole screen.
This one works with the aspect ratio I have in the editor:
But not with the aspect ratio in the game. So the next job would be finding a node (or combination of nodes) that could feed the actual player camera’s aspect ratio into that multiplier. Any ideas?
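For what it’s worth, the correction being described amounts to something like this. It is purely hypothetical: ViewportSize would have to be a vector parameter that you update yourself from game code (e.g. on a material instance), since I don’t know of a stock node that exposes the live viewport resolution, and ScreenUV again stands for the output of the ScreenPos/ComponentMask chain.

```hlsl
// Hypothetical aspect correction; ViewportSize is a parameter updated from game code.
float aspect = ViewportSize.x / ViewportSize.y; // e.g. 1280/720 = 1.777...
float2 uv = ScreenUV.rg;   // screen-aligned coordinate from the ScreenPos chain
uv.x *= aspect;            // scale U so the texture matches the viewport aspect ratio
return tex2D(Tex, uv);
```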