Discussion in 'Large Format Cameras and Accessories' started by RalphLambrecht, Feb 3, 2012.
Yes, I know what it is, but how is acutance actually measured?
The acutance tests I've seen illustrated in books use a thin knife-edge mask laid over a film; an exposure is made over the edge and the film is developed. Afterwards the film is examined with a microscope to determine how much the exposed and developed areas have spread out to darken the area under the mask.
In theory the area under the mask should have no density. In reality it always has some developed image density in the area under the mask and close to the edge.
Those film and developer combinations that produce the least spread in the masked area are judged to have relatively high acutance. Film and developer combinations that produce more spread are judged to be of relatively lower acutance.
Without consulting some of these articles I can't comment on the precise method of quantifying the differences in acutance, other than to characterize it as a measure of the unintended density in the film on the wrong side of the knife-edge mask.
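The spread under the mask can be modelled numerically. The sketch below is purely illustrative: the Gaussian spread function, its width, and the density values are assumptions standing in for optical and chemical diffusion, not data for any real emulsion.

```python
import numpy as np

# Illustrative sketch: model the developed edge as an ideal density step
# (base fog under the mask, Dmax in the exposed area) smeared by a Gaussian
# spread function. The kernel width sigma is an assumption, not measured data.
x = np.linspace(-0.5, 0.5, 1001)            # mm; x < 0 lies under the mask
ideal = np.where(x < 0, 0.05, 2.0)          # base fog one side, Dmax the other
sigma = 0.02                                # assumed spread, mm
kernel = np.exp(-0.5 * (x / sigma) ** 2)
kernel /= kernel.sum()                      # normalize so density is conserved
smeared = np.convolve(ideal, kernel, mode="same")

# Unintended density 0.02 mm inside the masked area, above base fog:
i = np.argmin(np.abs(x + 0.02))
print(round(smeared[i] - 0.05, 2))
```

Close to the edge the masked side picks up appreciable density above base fog, and the further from the edge you measure, the closer the trace falls back to fog level, which is exactly the spread the microscope examination looks for.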
The 3rd paragraph here might be useful.
You might get a more useful answer on how acutance is actually measured and quantified by contacting someone at Ilford.
Interesting. I didn't think you could objectively quantify acutance since it is somewhat a function of adjacency effects in addition to the actual contours of the grains and/or grain groups.
I would have thought the only thing you could really "measure" would be the relative sharpness or uniformity of the edge of each developed silver grain or filament or whatever. I think of this as the lowest measurable level of all the successive factors/variables ultimately combined to give an overall impression of acutance.
So in other words I picture a kind of tree diagram starting with silver grain contour sharpness/uniformity, which presumably could be measured objectively without the influence of other variables. Then you start piling on all the other variables which combine to give an impression of relative acutance, then add things like granularity and contrast to get an impression of sharpness.
Just some thoughts I'm throwing out there.
Perrin of Kodak defined acutance as (1) the average slope of the density-distance plot from a knife-edge negative.
"Acutance" as generally used is somewhat different: it means (2) the sense of sharpness taking into account adjacency effects.
"Controls in Black and White Photography" by R. Henry details how (1) is measured with a microdensitometer.
There is no means of obtaining a numerical value for (2).
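As a rough sketch of how (1) could be computed from a microdensitometer trace: the code below takes the gradient of density with distance across the edge and averages its square over the edge region, normalized by the density range. The edge profiles, sample spacing, edge-detection threshold, and normalization are all illustrative assumptions, not Perrin's exact published procedure.

```python
import numpy as np

def acutance(x_mm, density):
    """Mean squared density gradient across the edge, normalized by the
    density range -- one common form of definition (1). The 5% threshold
    used to pick out the edge region is an arbitrary assumption."""
    grad = np.gradient(density, x_mm)                 # dD/dx at each sample
    edge = np.abs(grad) > 0.05 * np.abs(grad).max()   # samples on the edge
    return float(np.mean(grad[edge] ** 2) / (density.max() - density.min()))

# Two synthetic knife-edge traces (logistic ramps; the widths are assumptions):
x = np.linspace(-0.5, 0.5, 2001)                 # mm across the edge
sharp = 0.1 + 1.9 / (1 + np.exp(-x / 0.01))      # tight ramp
soft = 0.1 + 1.9 / (1 + np.exp(-x / 0.05))       # smeared ramp
print(acutance(x, sharp) > acutance(x, soft))    # prints True
```

Both traces cover the same density range, so the steeper edge scores higher, which matches the intuition that acutance rewards an abrupt density transition rather than overall contrast.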
Damn! When the "Guy Who Wrote The Book" on the subject asks about a synthetic detail, discussion certainly becomes a whole lot more pointed and respectful. Congratulations APUG, and PLEASE let's continue along this same vein.
I think of acutance as the derivative (slope) of negative density across a sharp boundary that should theoretically reach Dmax on one side and Dmin on the other side of the boundary. So you put a sharp edge (knife) on the film, expose, develop, and see what you get. If everything were perfect, you would get something like a step function, but in reality you get a smooth slope because the materials don't have perfect resolution, there is some smearing from development, and there is probably some light leakage along the edge as well.
So in other words you just want to know how much the film and the development smears out what should be a very abrupt change of tone from one extreme to the other.
So my way of thinking seems to agree with Ian's very closely.