alanrockwood
The Nyquist sampling theorem says that it is possible to exactly reconstruct a bandwidth-limited image if the image is sampled at a rate greater than the Nyquist limit. The Nyquist limit is twice the highest frequency present in the image. Here we are referring to spatial frequency, and "bandwidth limited" means that there is a frequency limit above which the image contains no frequency components.
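To give a rough feel for what that limit means in scanning terms, here is a minimal sketch; the 50 cycles/mm figure is just an assumed example, not a number from this post:

```python
# Hypothetical example: film detail repeating 50 times per millimetre.
highest_frequency = 50.0                  # cycles per mm (assumed value)
nyquist_limit = 2.0 * highest_frequency   # samples per mm needed: 100
min_scan_ppi = nyquist_limit * 25.4       # roughly 2540 samples per inch
print(nyquist_limit, min_scan_ppi)
```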
Sometimes we misunderstand the implications of the Nyquist sampling theorem. While it does say that it is possible to reconstruct the image (i.e. the signal) without error, it does not say that the sampled points themselves constitute an exact reconstruction of the original image. Let me illustrate this with a simple one-dimensional example.
First we have a simulated image as it would exist prior to sampling.
By the way, this image is a simple sinusoidal function (a cosine function).
Next we have the result of sampling this image at 1.1 times the Nyquist limit. This sampling rate is fast enough that in theory it is possible to exactly reconstruct the original image shown above.
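For anyone who wants to reproduce the effect, here is a minimal sketch of that experiment; the frequency and interval are assumed values, since the exact numbers behind the plots are not given here:

```python
import numpy as np

# Assumed values standing in for the plots in this post.
f_signal = 1.0                    # frequency of the cosine (cycles per unit length)
f_sample = 1.1 * 2.0 * f_signal   # sample at 1.1x the Nyquist limit

x_fine = np.linspace(0.0, 40.0, 4001)              # dense grid standing in for the original image
original = np.cos(2.0 * np.pi * f_signal * x_fine)

x_samples = np.arange(0.0, 40.0, 1.0 / f_sample)   # sample positions
samples = np.cos(2.0 * np.pi * f_signal * x_samples)

# Plotting samples against x_samples (e.g. with matplotlib) shows the strong
# beat-like envelope described below, even though f_sample > 2 * f_signal.
```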
Note that this is a considerable distortion compared to the original signal. Rather than tracing out the simple sinusoidal shape of the original function, the sampled result shows a strong beat pattern. This definitively shows that the sampled image itself does not necessarily accurately reproduce the original signal, even if the sampling rate satisfies the Nyquist theorem. A more sophisticated calculation is required to accurately reconstruct the original function from the sampled result. Simply taking the sampled function itself as the reconstruction is not sufficient and is prone to error.
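The "more sophisticated calculation" the theorem has in mind is essentially Whittaker-Shannon (sinc) interpolation. Here is a minimal sketch, continuing from the arrays in the previous snippet; edge effects near the ends of the sampled interval are ignored:

```python
def sinc_reconstruct(x_out, x_samples, samples, f_sample):
    """Whittaker-Shannon interpolation: a sum of shifted sinc kernels.

    np.sinc(t) is sin(pi*t)/(pi*t), which is the kernel the theorem calls for.
    """
    kernel = np.sinc(f_sample * (x_out[:, None] - x_samples[None, :]))
    return kernel @ samples

recovered = sinc_reconstruct(x_fine, x_samples, samples, f_sample)
# Away from the ends of the interval, 'recovered' closely matches 'original',
# whereas the raw samples alone showed the beat pattern.
```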
I could add some other interesting examples showing how things get better if the sampling rate is even higher. However, to keep this simple let me defer doing this and just discuss the implications of what I have shown above.
So, what are the implications? Well, in discussions of the resolution of scanned images the topic of the Nyquist limit often arises, and it is stated that the sampling rate needs to be above the Nyquist limit if the scanner is going to resolve a repetitive image (lines). This is true, but it doesn't fully capture the importance of sampling at an even higher rate, because if the sampling rate is only slightly above the Nyquist limit then repetitive features in images may be significantly distorted whenever the sampled result itself is taken as the image reconstruction (which it is not). Higher-order analysis must be done in order to provide an accurate reconstruction. Since few if any scanning systems today attempt such sophisticated reconstructions, it is important that the sampling rate be significantly higher than the Nyquist limit to ensure that the scan provides an accurate reproduction of the image.
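To put a rough number on that, here is a minimal sketch comparing how far the raw samples, taken at face value and simply interpolated linearly, stray from the true signal at different multiples of the Nyquist limit; the exact error values depend on the assumed frequency and interval:

```python
import numpy as np

f_signal = 1.0
x_fine = np.linspace(0.0, 40.0, 4001)
original = np.cos(2.0 * np.pi * f_signal * x_fine)

for oversampling in (1.1, 2.0, 4.0):
    f_sample = oversampling * 2.0 * f_signal
    x_s = np.arange(0.0, 40.0, 1.0 / f_sample)
    s = np.cos(2.0 * np.pi * f_signal * x_s)
    # Take the raw samples at face value (linear interpolation between them),
    # as a simple scan viewer effectively does, and measure the worst-case error.
    naive = np.interp(x_fine, x_s, s)
    print(f"{oversampling:>4}x Nyquist: max error {np.max(np.abs(naive - original)):.2f}")
```

The printed worst-case error shrinks as the oversampling factor grows, which is the point being made above: well above the Nyquist limit, the raw samples themselves become a serviceable stand-in for a proper reconstruction.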
By the way, a similar analysis applies to images acquired by digital cameras.