The FForge one is slow and not terribly precise. (I’m using a series of tweaked
min/max functions.)
The Pixel Bender implementation is much faster and much more precise, but very
unstable. Instead of checking a 256x256 block around each pixel, I can only go
up to 116x116 before my display driver crashes. From Photoshop, the same
code goes south even at 36x36 iterations. o_O
116 shades of gray seem to be enough if you rescale the 1K result to 64x64,
but crashing PS… that’s a real problem. And I’m not even sure the loops work
on ATI cards.
Unfortunately I don’t have time to learn C++ and VC right now.
However, I was wondering…
/me starts thinking out-loud.
So Pixel Bender crashes beyond 116x116 loops. Can I use the same number of
loops more efficiently?
Right now it’s brute force, as I check a rectangle-shaped region around the
target pixel:
[][][][][]
[][][][][]
[][]><[][]
[][][][][]
[][][][][]
…and so on. (Imagine that it’s not 5x5 but 255x255.)
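The brute-force scan described above can be sketched in Python (this is purely illustrative, not the actual Pixel Bender code; `img` is assumed to be a binary image stored as a list of rows):

```python
def nearest_edge_distance(img, x, y, r):
    """Distance from (x, y) to the closest pixel of the opposite value,
    scanning every pixel of a (2r+1) x (2r+1) window: O(r^2) samples."""
    h, w = len(img), len(img[0])
    best = float("inf")
    for dy in range(-r, r + 1):          # every row of the window...
        for dx in range(-r, r + 1):      # ...and every column
            sx, sy = x + dx, y + dy
            if 0 <= sx < w and 0 <= sy < h and img[sy][sx] != img[y][x]:
                best = min(best, (dx * dx + dy * dy) ** 0.5)
    return best
```

With r = 127 that is 65,025 samples per output pixel, which is why the loop count matters so much.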
But I don’t need to check the corners, as they are too far away; only pixels
within a 127-pixel radius will produce useful distance data.
    []
  [][][]
[][]><[][]
  [][][]
    []
That’s about 22% fewer pixels to check. Better, but still not enough for full precision.
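That figure is easy to sanity-check (a Python sketch, counting which window pixels fall inside the inscribed disc):

```python
def disc_savings(r):
    """Fraction of a (2r+1) x (2r+1) window skipped by keeping only the
    pixels within Euclidean radius r of the center."""
    square = (2 * r + 1) ** 2
    disc = sum(1 for dy in range(-r, r + 1)
                 for dx in range(-r, r + 1)
                 if dx * dx + dy * dy <= r * r)
    return 1.0 - disc / square
```

For a 127-pixel radius this lands near 1 − π/4 ≈ 21.5%.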
If I start testing at the close neighbors of the pixel, then I could get the
shortest valid distance early on and skip everything farther out:
    9]
  8]1]2]
[]7]><3][]
  6]5]4]
    []
This would make the crash content-dependent: a big enough contiguous black or
white area could still produce too many loops.
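The near-to-far idea can be sketched in Python (using square rings rather than true circles; illustrative only, not the shader code):

```python
def nearest_edge_spiral(img, x, y, r_max):
    """Search outward, ring by ring, and stop as soon as no farther ring
    can possibly beat the best hit found so far."""
    h, w = len(img), len(img[0])
    target = img[y][x]
    best = float("inf")
    for r in range(1, r_max + 1):
        # Simple (if wasteful) ring enumeration: keep only the outline.
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if max(abs(dx), abs(dy)) != r:
                    continue
                sx, sy = x + dx, y + dy
                if 0 <= sx < w and 0 <= sy < h and img[sy][sx] != target:
                    best = min(best, (dx * dx + dy * dy) ** 0.5)
        # Every pixel on ring r+1 or beyond is more than r away, so once
        # best <= r nothing farther out can improve it.
        if best <= r:
            break
    return best
```

Note the early-out condition: a hit on ring r can be up to r·√2 away, so the loop has to keep going until the ring radius catches up with the best distance. A big uniform area still degenerates to the full scan, which is the content-dependent worst case mentioned above.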
Alternatively, a multipass algorithm could help: one that stops just before the
crash. The filter is then called again on its own result, so it picks up where it
left off and continues the job. Rinse, repeat until it’s done. Hmm…
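The multipass idea, sketched in Python (hypothetical names; in a real filter the running state would have to be encoded in the output pixels between invocations):

```python
def multipass_min(values, budget, state=None):
    """One bounded 'pass': fold at most `budget` values into a running
    minimum, returning (state, done). Calling again with the returned
    state continues the job, mimicking re-applying a filter to its own
    output."""
    if state is None:
        state = (float("inf"), 0)        # (best so far, next index)
    best, i = state
    stop = min(i + budget, len(values))
    for j in range(i, stop):
        best = min(best, values[j])
    return (best, stop), stop == len(values)
```

The caller just loops: `state, done = multipass_min(data, budget, state)` until `done` is true, with `budget` set below whatever triggers the crash.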
It would still be great to know why the driver crashes in the first place… I
tried running the filter on the CPU, but I got bored and killed it when nothing
had happened for 10 minutes with all four cores maxed out.
I was able to go full precision on a 1K image, but only from inside PB; PS still
crashes at an earlier point (~10% fewer iterations).
It’s content- and resolution-dependent, but there is an indication: when I can
count to 3 between updates, I’m closing in on the crash.
I’m gonna ask Adobe about this, but until then I’d appreciate any feedback:
how and when it crashes, with which graphics card, which driver, etc.
I just used your filter on a 1024² image at 128 pixel distance and it worked like a charm.
Only when I turned the “process on GPU” flag off/on did I run into crashes.
I’m running on a NVIDIA GeForce GTX 260 with Photoshop CS4.
Very useful filter indeed!
edit1: I only wish that I could go beyond 128 pixels in range - after downscaling, the lack of precision shouldn’t be a problem anymore.
edit2: Another thing I noticed is that pixels beyond the bounds of the image seem to default to black - so an image with a white border will have a gradient at its borders. (see attachment) I think extending the last pixel beyond the border would be the correct behaviour.
I also looked into solving the issues with samples outside the image bounds.
The sample function in Pixel Bender cannot be tweaked to clamp the texture coordinates, like in HLSL for example.
Also, I found no way to figure out the size of the image so I could clamp the coords myself.
Can it be that such an advanced scripting language doesn’t support such a basic operation?
:o
Yeah, I kinda expected to have “repeat” or “clamp” tiling modes or something,
yet everything is 0,0,0,0 beyond the border. :\ Making seamless distance fields
needs more manual work this way.
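If the image dimensions can be fed in as user parameters (the language itself won’t report them, as noted above), clamp-to-edge can be emulated by hand. A Python sketch of the idea:

```python
def sample_clamped(img, x, y):
    """Clamp-to-edge sampling: coordinates outside the image are pulled
    back to the nearest border pixel instead of reading 'black'."""
    h, w = len(img), len(img[0])
    x = min(max(x, 0), w - 1)
    y = min(max(y, 0), h - 1)
    return img[y][x]
```

The same two min/max lines dropped in front of every sample call would give the "extend the last pixel" behaviour requested in edit2.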
In this version you can select an area to process using 64-pixel steps. (The
default selection covers 1024x1024 pixels.) When you’re done with all the
adjustments, turn the “SelectionPreview” flag off.
Now one can process images of arbitrary size using full precision: select part of
the image, small enough so it doesn’t cause a crash, apply the filter, select
another area, apply, rinse, repeat.
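The select-apply-repeat workflow amounts to tiling the image. A rough Python sketch (the tile size is whatever stays below the crash threshold; the names here are made up for illustration):

```python
def tiles(width, height, tile):
    """Enumerate tile rectangles (x0, y0, x1, y1) covering the whole
    image, so each region can be filtered in its own pass."""
    for y0 in range(0, height, tile):
        for x0 in range(0, width, tile):
            yield (x0, y0, min(x0 + tile, width), min(y0 + tile, height))
```

Edge tiles are simply smaller, so arbitrary image sizes work without padding.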
I hate to revive an old thread, but I thought this might interest you.
I created a Pixel Bender shader to fill empty pixels with the color of their closest opaque neighbor - as a post-process for baking textures - and I ran into a few holes and some noise.
After looking at the code in more detail, it turns out that growing the Bresenham circle pixel by pixel will not necessarily cover all the pixels inside the circle.
The result is some subtle noise in the distance field, which becomes more apparent when doing other things with the distance.
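The coverage gap is easy to demonstrate. Here is a Python sketch using one common midpoint-circle variant (implementations differ in details, but a radius-r circle plots only about 5.7r pixels while the one-pixel-wide annulus at radius r holds about 6.3r, so holes are unavoidable either way):

```python
def midpoint_circle(r):
    """Integer circle outline at radius r (one common midpoint variant)."""
    pts, x, y, d = set(), 0, r, 1 - r
    while x <= y:
        for px, py in ((x, y), (y, x)):          # mirror into all octants
            pts |= {(px, py), (-px, py), (px, -py), (-px, -py)}
        x += 1
        if d < 0:
            d += 2 * x + 1
        else:
            y -= 1
            d += 2 * (x - y) + 1
    return pts

# Union of all circles up to radius R vs. every pixel inside the disc:
R = 25
covered = set().union(*(midpoint_circle(r) for r in range(R + 1)))
disc = {(x, y) for x in range(-R, R + 1) for y in range(-R, R + 1)
        if x * x + y * y <= R * R}
holes = disc - covered    # non-empty: the grown circles skip pixels
```

Any pixel in `holes` is never sampled, which is exactly where the subtle noise in the field comes from.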
this image shows the noise in my edge-expanding filter:
orthogonal edges (right) work fine, but anything angled (left) gets noisy
this image shows a crop of a distance field from your filter - a bit enhanced on the right:
I thought about ways to fix this, but the only thing I could come up with is sampling in a growing rectangle and discarding far-away samples early. It would be more expensive than now, but it wouldn’t miss any pixels.
Hmm, interesting. I noticed the artifacts caused by single white
pixels but missed the subtle noise. (I didn’t bother debugging,
just applied a despeckle filter.)
Going with a growing rectangle doesn’t sound that bad now
that an image can be processed in multiple passes. I’m not sure
when I’ll be able to tackle this so you’re probably better off
fixing it at your end for now.