Improving on sharpening using unsharp function(s)

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format
I have been giving a lot of thought to how to improve sharpening functions, specifically how to sharpen while minimizing halo formation. I have focused on unsharp masking functions and have done some computational experiments.

Here is what my experiments are showing me. For a given amount of sharpening, it is better to sharpen several times using a narrow blur function than once using a wide blur function. This figure gives an example.

[Figure: edge-sharpening comparison, a single unsharp pass with a wide blur vs. repeated passes with a narrow blur]

What the figure shows is that applying successive unsharp operations with a narrow blur function produces less overshoot than applying a single unsharp operation with a wide blur function.

Specifically, when the unsharp operation with a narrow blur is applied multiple times, the peak of the overshoot sits closer to the edge being sharpened, and the amount of overshoot is smaller than with a single unsharp operation using a wide blur. This means that the halo will be less apparent.

Also the slope in the transition region is steeper when using multiple narrow blur functions.

Overall, one can conclude that it is probably better to apply several successive unsharp operations with a narrow blur than a single unsharp operation with a wide blur.
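For anyone who wants to try this, here is a minimal 1D sketch of the comparison, assuming gaussian blurs via scipy; the sigmas, the test edge, and the unsharp "amount" are illustrative placeholders, not the exact values from my calculations.

[CODE]
# Minimal 1D sketch: one unsharp pass with a wide blur vs. several passes with a
# narrow blur. The parameters here are illustrative placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def unsharp(signal, sigma, amount=1.0):
    """One unsharp-mask pass: original plus 'amount' times (original minus blurred)."""
    return signal + amount * (signal - gaussian_filter1d(signal, sigma))

x = np.arange(400, dtype=float)
edge = gaussian_filter1d((x > 200).astype(float), 4.0)   # a blurred edge

once_wide = unsharp(edge, sigma=8.0)                     # single wide-blur pass
many_narrow = edge
for _ in range(8):                                       # several narrow-blur passes
    many_narrow = unsharp(many_narrow, sigma=8.0 / np.sqrt(8))

print("max overshoot, single wide pass:", once_wide.max() - 1.0)
print("max overshoot, 8 narrow passes :", many_narrow.max() - 1.0)
[/CODE]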

I can post a more detailed explanation of the calculations if anyone is interested.

Any thoughts?
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format
OK. I did another study.

Here are the details. I started with a gaussian point with a variance of 16 units.

I did an unsharp mask procedure using a gaussian blur function with a variance of 8 units, applied once.

I did a second unsharp mask procedure using a gaussian blur function with a variance of 1 unit, applied 8 times.

(These are not the same parameters I used in the previous study, and for presentation purposes I am using an original function as a blurred point rather than a blurred edge.)

The resulting widths of the two sharpened functions are the same, as measured by the second moment, which ended up being 8 units in both cases. The second moment is basically the same thing as the variance (the square of the standard deviation) here, except for a technical issue: the sharpened functions are not, strictly speaking, probability distributions, because their tails undershoot into negative values. That technical detail is not important when comparing the widths of the functions. The areas of the two peaks, as measured by the integrals of the functions, are also the same.
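A rough sketch of how this kind of comparison can be set up follows; the grid, the unsharp "amount", and the normalization are assumptions for illustration, not a copy of my actual code.

[CODE]
# Sketch of the peak experiment: a gaussian "point" of variance 16, sharpened
# (a) once with a blur of variance 8 and (b) eight times with a blur of variance 1,
# then compared by area, second central moment, and peak height.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.arange(-200, 201, dtype=float)
peak = np.exp(-x**2 / (2 * 16.0))              # gaussian point, variance 16

def unsharp(f, variance, amount=1.0):
    return f + amount * (f - gaussian_filter1d(f, np.sqrt(variance)))

wide_once = unsharp(peak, variance=8.0)        # one wide-blur pass
narrow_x8 = peak
for _ in range(8):
    narrow_x8 = unsharp(narrow_x8, variance=1.0)   # eight narrow-blur passes

def second_moment(f):
    """Second central moment of f, treating f as a (possibly negative) weight."""
    area = f.sum()
    mean = (x * f).sum() / area
    return ((x - mean) ** 2 * f).sum() / area

for name, f in [("wide x1  ", wide_once), ("narrow x8", narrow_x8)]:
    print(name, "area:", f.sum(), "second moment:", second_moment(f), "peak:", f.max())
[/CODE]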

The important observation is that, even though the degree of sharpening is the same for the two procedures, the second procedure produces a better result: there is less undershoot (the undershoot is smaller and occurs closer to the center of the peak), the slope of the curve is steeper, and the peak height is greater.

This implies that when sharpening you will produce fewer sharpening artifacts (like halos) if you sharpen multiple times using a narrow blur function than if you sharpen once using a fatter blur function.

It remains to be seen how well these results hold up with different functions, for example if the blur in the original image is not gaussian, or when the width of the blur in the original image is different from the width of the blur used in the unsharp masking procedure.

 
Last edited:

albada

Subscriber
Joined
Apr 10, 2008
Messages
2,175
Location
Escondido, C
Format
35mm RF
If you want to pursue sharpening more deeply, I suggest learning how to create plug-ins for Gimp. Then you can sharpen any way you want. In the 1990s, HP printers had an internal feature the engineers called the "variable sharpener", designed to avoid halos by varying the sharpening factor based on something I no longer remember. It worked well, and you might find the algorithm in patents. Anyway, I encourage you to experiment with sharpening algorithms.

Mark
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format

Thanks for the suggestion. I am not sure that my programming skills would be up to the task of writing plugins for Gimp.

To discuss things further, from a theoretical perspective the very best way to sharpen is to apply a deconvolution function tailored to the point spread function of the image.

If one has somehow obtained the point spread function that is responsible for the loss of sharpness in the image, then one can take the Fourier transform of the point spread function and use that to find out which spatial frequencies to tweak (enhance) to sharpen the image.

If one then takes the Fourier transform of the actual image, tweaks the spatial frequency components of the transformed image according to what was found in the previous paragraph, and then does an inverse Fourier transform back to image space, the result will be a sharpened image.
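A rough sketch of that procedure, assuming the point spread function is known, might look like the following; the small eps term is my own numerical guard against dividing by near-zero frequency components, not part of the ideal procedure.

[CODE]
# Rough sketch of frequency-domain deconvolution with a known point spread
# function (PSF). The eps guard is only there to avoid division by near-zero
# components of the PSF's transform.
import numpy as np

def deconvolve_fft(image, psf, eps=1e-3):
    """Naive inverse filter: divide the image spectrum by the PSF spectrum."""
    psf_padded = np.zeros_like(image, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    # center the PSF on the (0, 0) pixel so it introduces no shift
    psf_padded = np.roll(psf_padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(psf_padded)                  # transfer function of the blur
    G = np.fft.fft2(image)                       # spectrum of the blurred image
    F = G * np.conj(H) / (np.abs(H) ** 2 + eps)  # boost the frequencies the PSF suppressed
    return np.real(np.fft.ifft2(F))
[/CODE]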

One advantage of this approach is that it does not assume anything about the functional form of the point spread function or the blurring function used in an unsharp masking procedure. For example, it doesn't assume that those will be gaussian functions.

However, this is not necessarily an easy way to do it, for a number of reasons. The first is that the user probably doesn't actually know what the proper point spread function is, so in practice there will probably be a mismatch between the true point spread function and the one used in the procedure.

A second reason is that 2D Fourier transforms involve fairly lengthy calculations, so the process using FTs will probably be slow. One way around this might be to do a direct deconvolution (which can be implemented in the form of a convolution) using a localized sharpening function. That method would only need to look at small parts of the image at a time, so it might be faster, though I couldn't say for sure. However, this relies on the deconvolution function being localized so that the deconvolution doesn't require too much computation.
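As an illustration only, one localized, spatial-domain scheme that approximates deconvolution is van Cittert-style iteration, which only ever convolves with the small blur kernel; this is a sketch of that idea, not a claim about which method is best.

[CODE]
# Sketch of a purely local, spatial-domain approximation to deconvolution
# (van Cittert-style iteration). Each step only convolves with the small blur
# kernel, so it never needs more than a small neighborhood of the image at once.
import numpy as np
from scipy.ndimage import gaussian_filter

def van_cittert(image, sigma, iterations=5, relaxation=1.0):
    estimate = image.astype(float).copy()
    for _ in range(iterations):
        # add back the residual between the observed image and the re-blurred estimate
        estimate = estimate + relaxation * (image - gaussian_filter(estimate, sigma))
    return estimate
[/CODE]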

A third reason is that the point spread function is probably not the same over the whole picture, so likely only part of the picture will become well sharpened. The rest will be under-sharpened, or maybe even over-sharpened. This is actually a disadvantage of unsharp masking procedures as well.

One thing to keep in mind is that any sharpening method is going to increase noise in the image. For example, it will accentuate graininess and scanner noise. This is of course well known, though I am not sure it is always fully appreciated.

Another thing to keep in mind is that, for technical reasons and as a practical matter, sharpening can only be taken so far. It is unlikely that one could take a really blurry image and produce a really sharp image without a lot of artifacts introduced during the sharpening process, such as funky boundary regions around points and edges (halos are one example) and huge noise issues.
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format

Also, one reason I have been thinking about this is to figure out how to get sharper images out of a scanner like an Epson 700 or 800. I think some improvement is possible, but that will depend on how much high spatial frequency is present in the scanned image, and that will in turn depend on the quality of the lens in the scanner, as well as how well the scanner is focused. If there is a sharp cutoff in the spatial frequencies in the image then there is a hard cutoff in how much sharpening can be done.
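One way to get a feel for how much high spatial frequency content a scan actually has is to look at its radially averaged power spectrum; here is a rough sketch of that check (the file name and the sampling of the output are placeholders).

[CODE]
# Sketch: radially averaged power spectrum of a scan, to see roughly where the
# spatial-frequency content falls off. File name and output sampling are placeholders.
import numpy as np
from PIL import Image

scan = np.asarray(Image.open("scan_crop.tif").convert("L"), dtype=float)
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(scan))) ** 2

cy, cx = scan.shape[0] // 2, scan.shape[1] // 2
yy, xx = np.indices(scan.shape)
radius = np.hypot(yy - cy, xx - cx).astype(int)          # ring index for each pixel

# mean power in rings of (roughly) constant spatial frequency
ring_power = np.bincount(radius.ravel(), weights=spectrum.ravel()) / np.bincount(radius.ravel())
for r in range(0, ring_power.size, 50):
    print("frequency bin", r, "mean power", ring_power[r])
[/CODE]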
 

albada

Subscriber
Joined
Apr 10, 2008
Messages
2,175
Location
Escondido, C
Format
35mm RF
Thanks for the suggestion. I am not sure that my programming skills would be up to the task of writing plugins for Gimp.

I found it wasn't hard. I suggest getting the code for a simple plugin from the internet, and replacing the algorithms with yours.

Yes, FFT/IFFT might be slow. For the frequency domain, I tinkered with smoothing+sharpening using DCTs 15-20 years ago. DCT is the Discrete Cosine Transform, and it is the basis of JPEG compression. Unfortunately, I don't remember what my results were. Probably unimpressive. DCT might be worth another look. In fact, a JPEG compress/decompress cycle can be used to sharpen by using different quantization tables for compression and decompression. Wavelets work well, but their theory is abstruse. Smoothing+sharpening is interesting and challenging.
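For instance, a blockwise DCT boost might look something like this rough sketch; the 8x8 block size matches JPEG, but the gain ramp and boost factor are just placeholders, not anything I actually tested back then.

[CODE]
# Sketch of blockwise DCT sharpening: transform 8x8 blocks (as JPEG does),
# boost the higher-frequency coefficients, and transform back. The gain ramp
# and boost factor are placeholders.
import numpy as np
from scipy.fft import dctn, idctn

def dct_sharpen(image, block=8, boost=1.5):
    out = image.astype(float).copy()
    u, v = np.indices((block, block))
    gain = 1.0 + (boost - 1.0) * (u + v) / (2 * (block - 1))   # ramps from 1 (DC) up to 'boost'
    for y in range(0, out.shape[0] - block + 1, block):
        for x in range(0, out.shape[1] - block + 1, block):
            coeffs = dctn(out[y:y + block, x:x + block], norm='ortho')
            out[y:y + block, x:x + block] = idctn(coeffs * gain, norm='ortho')
    return out
[/CODE]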

Mark
 

reddesert

Member
Joined
Jul 22, 2019
Messages
2,460
Location
SAZ
Format
Hybrid
PSF deconvolution is a problem that has large literatures in different domains - different kinds of image processing, remote sensing, astronomy, etc. The methods that work well in one domain don't always work well in another. Even in astronomy (which is what I know about), for example, radio astronomy and optical astronomy tend to use completely different algorithms because the properties of the data and the instrumental effects are very different.

2D Fourier transforms can be done reasonably efficiently with an FFT, much faster than direct convolution. Setting this up can be difficult.

Methods that are promising in theory often do not work well on real data, for reasons like spatially variable PSF, and noise in the input image. Deconvolution will tend to strongly amplify noise, and most methods need to apply some kind of regularization to avoid overfitting the image.
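For example, a Wiener-style regularized inverse is one standard way to keep that noise amplification in check; here is a rough sketch, where K acts as a noise-to-signal knob that would have to be tuned for the data.

[CODE]
# Sketch of a Wiener-style regularized deconvolution. Larger K suppresses noise
# amplification at the cost of weaker sharpening; K has to be tuned.
import numpy as np

def wiener_deconvolve(image, psf, K=0.01):
    psf_padded = np.zeros_like(image, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    psf_padded = np.roll(psf_padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(psf_padded)
    G = np.fft.fft2(image)
    F = np.conj(H) * G / (np.abs(H) ** 2 + K)   # K regularizes frequencies where H is small
    return np.real(np.fft.ifft2(F))
[/CODE]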

The undershoot (ringing) in the wings of the unsharp masking kernel is, I think, a known problem: putting power into the high frequencies of the sharpened peak tends to produce ringing, for some of the same reasons you get ringing in many types of signal processing. I believe ways to deal with this include optimizing the smoothing kernels to be something other than just two gaussians. However, it's hard to make such artifacts go away entirely.
 

_T_

Member
Joined
Feb 21, 2017
Messages
421
Location
EP
Format
4x5 Format
So I tried what you suggested on a real photo and it didn’t really work. I used a blur dimension that just produced obvious halos and then divided that by 8 and ran it 8 times.

I didn’t get any sharpening of the actual features of the image. Just the grain was sharpened until it became terrible noise.

So I thought maybe I should try it with simple black-and-white blobs like you did, so I produced a block of dimension 8x, blurred it by 16x, then sharpened it 8 times with a blur dimension of 1x. It didn’t get any sharper; instead I saw a lot of banding.
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format

Let me clarify something. In the test, I scaled the narrow blur function so that its variance was 1/8 of the variance of the wide blur function. In terms of standard deviation, that corresponds to the narrow function being 0.354 times as wide as the wide blur function. (Width scales as the square root of variance, and the square root of 1/8 is 0.354.)
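In other words, if you split one pass into N passes, divide the variance by N, which means dividing the blur radius (sigma) by the square root of N; a tiny sketch, just to make the arithmetic concrete:

[CODE]
# Tiny helper for the scaling described above: splitting one unsharp pass of
# blur variance V into N passes means using variance V/N per pass, i.e.
# sigma_narrow = sigma_wide / sqrt(N).
import math

def narrow_sigma(sigma_wide, n_passes):
    return sigma_wide / math.sqrt(n_passes)

print(narrow_sigma(8.0, 8))   # 2.83..., i.e. 0.354 times the wide sigma
[/CODE]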

It might be worth trying that to see how that works.

I can't explain why you are seeing more noise with multiple applications of the narrow blur function compared to a single application of the wide blur function. However, please, if possible, scale the narrow blur function the way I suggested in the first paragraph of this post.

If you decide to try that then please let us know how it works out.
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format

Let me ask a question about the blobs. Were they similar to boxcar functions? A boxcar function has a constant value within some interval (such as 8 points wide), and zero values outside the interval.

In an image, in order to avoid negative values, one would define the image by starting with a constant value everywhere, and then adding another constant value within an interval (such as 8 points wide).

If you try to sharpen a boxcar function it doesn't work very well. Basically, you get big amounts of overshoot and undershoot (halos) at the boundaries of the boxcar function.

I have calculated an example. I am attaching three plots. The first is a boxcar function with a variance of 10. The second is the boxcar function after a single sharpening using a gaussian blur function with a variance of 5. The third is the boxcar function after two sharpenings with a gaussian blur function with a variance of 2.5.

[Plots: (1) the boxcar function, variance 10; (2) after one sharpening with a gaussian blur of variance 5; (3) after two sharpenings with a gaussian blur of variance 2.5]

Basically, all the sharpening did was add halos. The halos on the one that was sharpened twice had greater amplitude but didn't extend out as far from the edges. (Please excuse the typo in the figures: "vanriance" should be "variance".)
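If anyone wants to reproduce this, here is a rough sketch of the boxcar test; the boxcar width, the pedestal, and the unsharp "amount" are placeholders rather than my exact settings.

[CODE]
# Sketch of the boxcar test: a hard-edged blob on a constant pedestal, sharpened
# once with a blur of variance 5 vs. twice with a blur of variance 2.5.
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.arange(400, dtype=float)
boxcar = 1.0 + ((x > 180) & (x < 220)).astype(float)   # pedestal plus a boxcar

def unsharp(f, variance, amount=1.0):
    return f + amount * (f - gaussian_filter1d(f, np.sqrt(variance)))

once = unsharp(boxcar, variance=5.0)
twice = unsharp(unsharp(boxcar, variance=2.5), variance=2.5)

print("overshoot, one wide pass    :", once.max() - 2.0)
print("overshoot, two narrow passes:", twice.max() - 2.0)
[/CODE]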
 
Last edited:

_T_

Member
Joined
Feb 21, 2017
Messages
421
Location
EP
Format
4x5 Format
I think you’re a bit caught up on sharpening blobs. It seems like you may have overfit your solution not just to sharpening blobs, but to sharpening a specific kind of blob that isn’t particularly prevalent in real images.
 

cliveh

Subscriber
Joined
Oct 9, 2010
Messages
7,588
Format
35mm RF
I don't wish to rain on your parade, but as this is a digital image, the sharpening you illustrate is an illusion of the original scene and therefore lacks any integrity to what you actually recorded.
 

reddesert

Member
Joined
Jul 22, 2019
Messages
2,460
Location
SAZ
Format
Hybrid
All photographs are interpretations. I've seen a lot of analog black and white photographs of scenes that were probably originally in color.

Unsharp masking was an analog technique before it was used on digital images - it was a lot harder to do of course, because the practitioner had to make a physical unsharp mask as a slightly defocused duplicate of the original image. It can be used for local contrast enhancement (high spatial frequency) while reducing global contrast (at low spatial frequency).

Alan, I think it would be useful if, in addition to calculating simple gaussian models, you tried the experiment using the sharpening filter implemented in Gimp, Photoshop or similar. Take an image and try to sharpen it two ways: either with one moderate filter or N applications of a filter smaller in radius by some factor (e.g. smaller by sqrt(N)). On real images, repeated sharpening tends to amplify noise and the resulting image is not pleasing.
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format

As you point out, sharpening increases the amplitude of high spatial frequencies in the image. (If it doesn't emphasize high frequencies there is no sharpening).

Sharpening also increases noise. This applies to all sharpening methods, including unsharp masking methods that use a broad blur function. The critical question is whether a single unsharp masking step with a broad blur function emphasizes the noise less than multiple unsharp masking steps with a narrow blur function, under the constraint that an equivalent amount of sharpening takes place in either case.

I don't know the answer to that question, and I am prepared to accept the possibility that the second procedure (multiple applications of unsharp masking using narrow blur functions) might be more detrimental to the noise profile of the image. If so, then there would seem to be a trade-off: less haloing but more noise.
 

_T_

Member
Joined
Feb 21, 2017
Messages
421
Location
EP
Format
4x5 Format
I think you just need to try it yourself on an actual photo because it doesn’t sharpen anything but the noise. There is no improvement in sharpness to the image and it pretty much just deep fries it
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format
I think you just need to try it yourself on an actual photo because it doesn’t sharpen anything but the noise. There is no improvement in sharpness to the image and it pretty much just deep fries it

If it only sharpens the noise, then it means there is not enough high-frequency content in the underlying (noiseless) image for the image to be sharpened. No sharpening method can faithfully restore spatial frequencies that don't exist in the image.
 

_T_

Member
Joined
Feb 21, 2017
Messages
421
Location
EP
Format
4x5 Format
Of course. Couldn’t be a problem with the method you came up with. That’s been thoroughly proven out. Bomb proof at this point. Must be a problem with images in general I guess.
 

bags27

Member
Joined
Jul 5, 2020
Messages
583
Location
Rhode Island
Format
Medium Format
Not to deflect this, but the references above to Topaz AI interest me. I've been using--and liking--Topaz Sharpen. I've heard that Topaz AI doesn't give a lot of room for user judgment and some prefer to stay with the standalones Sharpen and DeNoise (even though they will not be updated). Any thoughts?
 
OP

alanrockwood

Member
Joined
Oct 11, 2006
Messages
2,189
Format
Multi Format
On the topic of noise increases during sharpening, the question has come up whether breaking sharpening into repeated steps, each with a small amount of sharpening, produces a noisier result than a single step with a greater amount of sharpening.

I tested that by doing some simulations, and confirmed that (based on the simulations) a series of small sharpening steps produces a noisier result than a single step of a larger amount of sharpening.

For example, I created a noise file with an RMS noise of one unit. Then I did an unsharp operation using a gaussian blur function with a variance of 4. The noise in the "sharpened" file was 1.82 times that of the original file.

Then I did four successive sharpening steps on the same original noise function using gaussian blur functions with a variance of 1. That result had 10.0 times as much noise as the original file. In other words, the second procedure enhanced the noise by a factor of about 5.5 relative to the first procedure. I must admit that this result came as a surprise to me, though in retrospect it should not have.
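For anyone who wants to check this, here is a rough sketch of the noise simulation; the noise length, the seed, and the unsharp "amount" are placeholders, not my exact setup.

[CODE]
# Sketch of the noise test: white noise of RMS 1, "sharpened" either once with a
# blur of variance 4 or four times with a blur of variance 1, compared by RMS.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)

def unsharp(f, variance, amount=1.0):
    return f + amount * (f - gaussian_filter1d(f, np.sqrt(variance)))

once = unsharp(noise, variance=4.0)
four = noise
for _ in range(4):
    four = unsharp(four, variance=1.0)

print("RMS after one pass (variance 4)   :", once.std())
print("RMS after four passes (variance 1):", four.std())
[/CODE]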

So we have a situation where there is a trade-off between halo and noise. One can reduce the halo in the unsharp procedure, but at the cost of more noise.

So which is better? There is not a single answer to this. I think it all depends on how one weighs noise against how much halo one is willing to accept, and that in turn depends on who is processing the image as well as the properties of the individual image. I think it's good to understand that a trade-off is available.

Thanks to the posters who pointed out the noise issue that arises when doing repeated steps, each with a small amount of sharpening.
 