Methodology and Curve Interpretation


aparat
I forgot to attach a plot showing the two 0.3G speed point calculations. The red value is obtained by the "w method" (Nelson and Simonds, 1955) and the classic Jones approach is marked in purple. In some of the more distorted curves, the "w method" tends to underestimate the speed point exposure (and overestimate the obtained speed) by a little bit, as Nelson and Simonds show in this table. Overall, I am finding about the same agreement in my data as they did. One day, I will have to try the graphic calculator provided for the "w method." I wonder if it is any more accurate than the numerical method.
Fig5_Nelson_and_Simonds.png


delta100Plots03Gcomparison.png
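For anyone who wants to try the 0.3G calculation numerically, here is a minimal sketch in Python (NumPy/SciPy). The data are synthetic, and the 1.5 log-H averaging interval reflects my understanding of Jones's definition, so treat it as an illustration rather than the exact procedure from the papers:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic step-tablet readings (log exposure in log mcs vs. diffuse density).
# Replace these with real densitometer data.
log_h = np.linspace(-3.2, -0.2, 16)
density = 0.10 + 1.9 / (1.0 + np.exp(-(log_h + 1.7) / 0.45))

curve = CubicSpline(log_h, density)
slope = curve.derivative()

# Fractional gradient (0.3G) as I understand Jones's definition: the speed point
# is where the local gradient equals 0.3 times the average gradient taken over
# the next 1.5 log-H units toward higher exposures.
grid = np.linspace(log_h[0], log_h[-1] - 1.5, 2000)
g_avg = (curve(grid + 1.5) - curve(grid)) / 1.5
first = np.argmax(slope(grid) / g_avg >= 0.3)   # first grid point meeting the criterion

speed_log_h = grid[first]
print(f"0.3G speed point at log H = {speed_log_h:.2f}")
print(f"old-style speed (reciprocal of H in mcs) = {1.0 / 10**speed_log_h:.0f}")
```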
 

ic-racer
Of the three methods, I was only able to do W speed in Microsoft Excel. Maybe someone more of a wizard in that environment could do more. However, if I could get Excel to calculate 0.3G, I'd not bother with the others, since their only utility is ease of calculation.

I spent thousands of dollars on THINK C, Mathematica, Data Desk, SuperANOVA, Aldus FreeHand, CodeWarrior, and other software tools getting my degree in the 1980s, but, of course, none of it runs on my current computer. So I'm stuck programming in Excel, which I have available on my work computer. Excel is so stupidly hard to use that it's useless at work; no one uses it.

Looking forward to the release of your software if, indeed, you are going to make it available!
 

aparat
@ic-racer I think the "w" speed method has been found to be a reasonable approximation of the 0.3G criterion method, so I don't think you need to worry about being able to compute film speed by other methods, esp. if you're happy with the results. I can sympathize with the software issue. You can probably add MATLAB to that list. Fortunately, these days open-source alternatives can be just as good.

Others will probably disagree, but to me, the fractional gradient criterion for estimating film speed is a bit of a moving target, in more ways than one. Early on, there was a lot of debate concerning the procedure for finding the gradient (and its value), including instruments that resemble medieval torture devices, but which, no doubt, were very ingenious and adequately accurate at the time.
martens_photometer.png

radiometer.png

In addition to the problem of instrumentation, theoretical problems with the various fractional gradient proposals were also pointed out. I find this quote from Varden (1958) to sum up the problem really well: "In tying a criterion in with an average gradient over a definite exposure range it must be assumed that all films for which the criterion is to be used have the same general sensitometric characteristics." A lot of the films in use today have very different sensitometric characteristics to those used in the 1930s and 1940s to establish the original ASA standard. To me, it all comes down to the goodness of fit. That was the problem then, and the same problem exists today. With a good curve model, fractional gradient speeds have a greater chance of being good approximations of what Jones (and others) intended for the standard to represent. And then, there are other complicating factors such as the contribution of flare light, safety factors, format of film, incompatibility with exposure meter calibration, etc.

While many people disagree with the current fixed density model, it is a lot less ambiguous than the earlier standards. Whether or not it approximates true film speed (if there is such a thing) well enough is a matter of debate.

Here's a plot of a curve from an Ilford Delta 100 family, showing the computationally determined 0.3G speed point (including the tangent line as per Jones's definition) and the corresponding film speed (0.3G EI = 116), together with the fixed density (0.1 over B+F) speed point and its corresponding speed (EFS = 118). The option for calculating the 0.3G speed I used here is the one recommended by Nelson (1960, eq. 17). However, the original speed value was expressed as the reciprocal of exposure in meter-candle-seconds (here, it would be EI = 176), and the exposure-meter-corrected value would be estimated at around EI = 44 (this includes the older safety factor of around 4). Yes, this is very convoluted, but that is, at least, my understanding of how the fractional gradient speed was used by photographers before the current ISO standard was established.

delta100_id68Plots.pdf03GTangentLine.png
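In case it helps anyone follow the fixed-density side of that, here is a minimal sketch in Python/SciPy. The data are synthetic, and the 0.8/Hm constant is the ISO-style formula as I understand it (it only applies when the ISO contrast condition is met), so treat the numbers as illustrative:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

# Synthetic step-tablet readings: log exposure (log mcs) vs. density.
log_h = np.linspace(-3.2, -0.2, 21)
density = 0.10 + 1.9 / (1.0 + np.exp(-(log_h + 1.7) / 0.45))
curve = CubicSpline(log_h, density)

base_fog = density.min()
log_hm = brentq(lambda x: float(curve(x)) - (base_fog + 0.1), log_h[0], log_h[-1])
hm = 10.0 ** log_hm                     # exposure at the 0.1-over-B+F point, in mcs

print(f"fixed-density speed point at log H = {log_hm:.2f}")
print(f"ISO-style speed, 0.8 / Hm = {0.8 / hm:.0f}")
print(f"old reciprocal speed, 1 / Hm = {1.0 / hm:.0f}")
print(f"with a 4x safety factor: {1.0 / hm / 4.0:.0f}")
```

The 176 to 44 step in the post is just that last line: the same reciprocal speed divided by the old safety factor of about 4.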
 

aparat
I want to follow up on my previous post regarding the original Jones fractional gradient model, 0.3G, and the w and Delta-X models. At the time Jones (and others) were working on establishing a model for determining film speed (and other parameters), the curve model was primarily that of a toe followed by a linear straight-line portion. The shoulder was often disregarded for the purposes of these calculations. Here's a typical curve and its corresponding fractional gradient parameters:
fractionalGradientModelJones.png

It seems that the films used in the 1940s could be adequately modeled by this type of characteristic curve. However, Varden (1958) argues that modern films (modern for the late 1950s, that is) were quite different: "they have much steeper gradients in the low exposure range and flatter gradients in the high exposure range in comparison with previously available materials." They also "now increase in effective speed rather rapidly with increases in development, without becoming excessively steep in contrast" and have a different tolerance to under- and overexposure. It's quite possible that today's films are, in turn, different from those described by Varden. Indeed, in my own tests of the currently available B&W films, I have found a great variety of characteristic curve shapes. Simply put, the characteristic curves of modern films cannot be adequately modeled by the methods applied at the time when the ASA standards were being forged.

For example, Bayer, Simonds, and Williams (1961) were aware of this fact and tried to model the characteristic curve by means of a logistic growth curve. They offered a range of examples showing the adequacy of the model. Here's what the curve looks like:

logGrowthModelBayerSimondsWilliams.png


The most important feature of their approach is the assumption that the characteristic curve cannot be properly modeled by a linear model, due to the non-linear relationships that exist at the ends of the curve. Even though their approach does not fit most modern curves well enough, it was, in my opinion, a step in the right direction. Here's a plot of a synthetic curve (the black dots) modeled by the Bayer, Simonds, and Williams method (the blue curve):
logGrowthModel.png


However, their general approach can be built upon, and a more sophisticated logistic model, such as the self-starting NLS four-parameter logistic model (which evaluates the four-parameter logistic function and its gradient), can be applied more successfully. Here's the same synthetic curve showing a much better fit:
ssfplSCurve.png
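For anyone curious what that looks like in code, here is a minimal sketch of a four-parameter logistic fit on synthetic data, using Python/SciPy. A self-starting routine (like the one referred to above) picks its own starting values; with scipy.optimize.curve_fit you supply rough ones yourself:

```python
import numpy as np
from scipy.optimize import curve_fit

def fpl(x, lo, hi, xmid, scal):
    """Four-parameter logistic: lower/upper asymptotes, midpoint, and scale."""
    return lo + (hi - lo) / (1.0 + np.exp((xmid - x) / scal))

# Synthetic step-tablet readings with a little noise.
rng = np.random.default_rng(1)
log_h = np.linspace(-3.2, -0.2, 21)
density = fpl(log_h, 0.10, 2.10, -1.7, 0.35) + rng.normal(0.0, 0.01, log_h.size)

p0 = [density.min(), density.max(), np.median(log_h), 0.3]   # crude starting values
params, _ = curve_fit(fpl, log_h, density, p0=p0)
print(dict(zip(["lo", "hi", "xmid", "scal"], np.round(params, 3))))
```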


There is also an alternative model that should, in theory, provide a good fit over a broad range of characteristic curves. A generalized additive model (GAM) seems to be a good candidate for the film curve because it is designed to allow the (generalized) linear model to "learn" nonlinear relationships, such as those that exist at both ends of the characteristic curve. Here's an example of the fit:
gamModelCurve.png
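I don't have a full GAM example handy, but the idea behind its smooth term can be sketched with a penalized smoothing spline, which is the building block a GAM uses for each smooth. A minimal Python/SciPy illustration on synthetic data (the smoothing value s is hand-picked here, whereas a real GAM would choose the penalty for you):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
log_h = np.linspace(-3.2, -0.2, 25)
density = 0.10 + 2.0 / (1.0 + np.exp(-(log_h + 1.7) / 0.35)) + rng.normal(0.0, 0.01, log_h.size)

# Cubic smoothing spline; s sets the roughness penalty (hand-tuned here).
fit = UnivariateSpline(log_h, density, k=3, s=len(log_h) * 0.01**2)
grad = fit.derivative()

x0 = -1.7
print(f"fitted density at log H = {x0}: {float(fit(x0)):.2f}")
print(f"local gradient there: {float(grad(x0)):.2f}")
```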



With a good model of the characteristic curve, parameters that describe it, such as gamma, average gradient, Contrast Index, and fractional gradient, can be computed more accurately by numerical, rather than graphical, methods. That was the purpose behind the work reported by Bayer, Simonds, and Williams, who phrased the sentiment rather well:
"Electronic computing machines offer to the sensitometrist not only speed, precision, and economy, but also new freedom in choosing parameters for describing sensitometric effects. Many properties of primary interest which are not easy to measure by graphical methods can be readily specified for digital analysis, and their amounts can be uniquely determined even though several properties vary simultaneously." It appears that, in 1958, Kodak engineers still used fountain pens and paper for the purpose, but it's clear that the field was quickly moving toward automated, computer-aided film analysis.

kodakDensitometerReading.png
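As a small illustration of that point, here is a sketch of pulling a couple of such parameters off a fitted curve numerically (Python/SciPy, synthetic data; the 1.5 log-H span and the "steepest slope" stand-in for gamma are my own simplifications, not anyone's published definition):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

log_h = np.linspace(-3.2, -0.2, 21)
density = 0.10 + 2.0 / (1.0 + np.exp(-(log_h + 1.7) / 0.35))
curve = CubicSpline(log_h, density)

# Reference point: 0.1 above base+fog.
lo = brentq(lambda x: float(curve(x)) - (density.min() + 0.1), log_h[0], log_h[-1])

# A simple "average gradient" over a fixed 1.5 log-H span from that point.
span = 1.5
g_bar = float(curve(lo + span) - curve(lo)) / span

# A crude gamma: the steepest local slope along the fitted curve.
grid = np.linspace(log_h[0], log_h[-1], 500)
gamma = float(curve.derivative()(grid).max())

print(f"average gradient over {span} log H = {g_bar:.2f}, steepest slope = {gamma:.2f}")
```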


Having a good model of the characteristic curve is useful not only for film analysis, but also for synthesis. Here's a family of curves that have been synthesized to possess specific properties, such as the amount of exposure, film speed, shadow compression, the length of the toe, average gradient, etc. Curve synthesis can be a very helpful tool for learning and modeling a film's response to exposure and development.

syntheticShort.png
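For what it's worth, the synthesis side can be sketched with the same four-parameter logistic used above: pick the parameters, generate the curve, and check that it has the behavior you wanted. Purely illustrative numbers again:

```python
import numpy as np

def fpl(x, lo, hi, xmid, scal):
    return lo + (hi - lo) / (1.0 + np.exp((xmid - x) / scal))

log_h = np.linspace(-3.5, 0.0, 200)

# A made-up "development series": shortening the scale steepens the curve,
# loosely mimicking longer development times.
for scal in (0.50, 0.40, 0.32, 0.26):
    d = fpl(log_h, 0.10, 2.20, -1.6, scal)                  # the full synthesized curve (e.g., for plotting)
    g_mid = (fpl(-1.0, 0.10, 2.20, -1.6, scal)
             - fpl(-2.2, 0.10, 2.20, -1.6, scal)) / 1.2     # gradient over a mid-curve span
    print(f"scal = {scal:.2f} -> mid-curve gradient = {g_mid:.2f}")
```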
 

ic-racer
Having a good model of the characteristic curve is useful not only for film analysis
Last year, in a rant of frustration, I questioned any mathematical model of H&D curves because of my inability to solve for unknown values. I posted the graph below, which I made in a spreadsheet. It was great that the basic functions of the spreadsheet gave an equation/curve-fit for the H&D curve, but the spreadsheet did not have the power to solve it for unknown values of "X." For example, what is the mathematical solution for X when Y = 20? Especially frustrating because one can just look at the graph and guess "6."

However, I came across this cool website. The website gave the answer, which I thought was mind-boggling.

So I take back what I posted last year about the lack of utility when curve-fitting H&D datapoints!
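For the record, this kind of "solve for X given Y" can also be done in a few lines of Python once a curve fit exists. Made-up numbers below, roughly shaped like that spreadsheet plot; the cubic trendline and the target Y = 20 are just stand-ins:

```python
import numpy as np
from scipy.optimize import brentq

# Made-up (X, Y) data standing in for the spreadsheet's points.
x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([2, 3, 5, 8, 12, 16, 20, 23, 25], dtype=float)

p = np.poly1d(np.polyfit(x, y, 3))      # same kind of polynomial trendline a spreadsheet gives

target = 20.0
x_at_target = brentq(lambda t: p(t) - target, x.min(), x.max())
print(f"X where Y = {target:g}: {x_at_target:.2f}")     # prints roughly 6, as eyeballed from the graph
```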


Screen Shot 2023-04-07 at 5.11.08 PM.png

screen-shot-2022-01-31-at-4-08-35-pm.png
 
Stephen Benskin (OP)
While many people disagree with the current fixed density model, it is a lot less ambiguous than the earlier standards. Whether or not it approximates true film speed (if there is such a thing) well enough is a matter of debate.

In my opinion, true film speeds would effectively be print judgement speeds, which, for the most part, are determinable, as they are based on the end results of the process. The current ISO fixed density speed is less ambiguous, which is one reason why it was chosen. The other is that it approximates the fractional gradient speed by incorporating the Delta-X criterion. If it didn't, the results wouldn't have the same relevance and the standard probably wouldn't have been adopted. A fixed density method becomes problematic when it moves outside of the ISO parameters, unless it is used along with the Delta-X criterion. Most of the methods outlined online and in general photography books produce results that come relatively close to those obtained with Delta-X and the ISO standard, because under normal conditions the parameters are similar. Outside those conditions, their methodology and accuracy are dubious.

Maybe you should have included the graph showing the spread of the 0.10 fixed density method vs print judgement speeds with a variety of films and processing conditions.

spread function for Delta X.jpg


I never spent any time on W speed because it was never adopted making it more of a historical footnote. Sorry Dale.

I'm not familiar with Varden so I don't know what his concerns were. Could you supply the publication information? Nelson was working on Delta-X at the same time as your Varden reference. Nelson was sure to be aware of these issues and sure to be aware of modern films and lenses.

One of the aspects that has always concerned me is the criterion the judges used to determine print quality. The papers from first excellent testing don't really specify, but Jones talks about it in The Psychophysical Evaluation of the Quality of Photographic Reproductions, PSA Journal, Vol 17, Dec 1951. "In this discussion, the term photographic quality will be used in referring to the degree of perfection with which the photographic picture reproduces in the mind of the observer the subjective impression which he received when looking at the original."

Were similar instructions given to the judges in the first excellent print tests? Statistically, most people use photography as a recording medium. In a normal distribution curve of how people use photography, those with more artistic intentions probably fall somewhere outside of the central area. It makes sense to cater to the greater population. What would your mom want to see when she got the photos back from the drug store? What if the intention wasn't a realistic reproduction of how something looks, but an interpretation that departs from a realistic impression of the subject, or, for lack of a better term, an artistic interpretation? A situation where the emotional element supersedes a strictly technical reproduction. How would this affect Jones' print judgement speeds and, consequently, sensitometric speeds? Or, looking at it from a different perspective, how would film speeds based on Jones' print judgement speeds be applicable to the artistic work of someone like Bill Brandt, Brett Weston, or Michael Kenna?

Perhaps, by basing print judgement speeds on a common and straightforward interpretation, reproducing a scene as close as possible to the impression of the actual scene, and by determining a methodology that identifies a limiting criterion equally applicable to all film types, we would get a knowable and solid reference point, from which departures can branch off and still have commonality.
 

aparat
Last year, in a rant of frustration, I questioned any mathematical model of H&D curves because of my inability to solve for unknown values. I posted the graph below, which I made in a spreadsheet. It was great that the basic functions of the spreadsheet gave an equation/curve-fit for the H&D curve, but the spreadsheet did not have the power to solve it for unknown values of "X." For example, what is the mathematical solution for X when Y = 20? Especially frustrating because one can just look at the graph and guess "6."

However, I came across this cool website. The website gave the answer, which I thought was mind-boggling.

So I take back what I posted last year about the lack of utility when curve-fitting H&D datapoints!
Very cool! I am glad your rant worked. You found an ingenious solution to the problem. That's awesome! I think polynomial interpolation, like Lagrangian polynomial interpolation, is a good way of finding intermediate values of y for a given x (or vice versa). I think it can be implemented in a spreadsheet, or you can find an online solver for it, like the one you linked.
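A quick sketch of the Lagrange approach in Python, in case it's useful; the four points are made up, and it's worth keeping the window small, since Lagrange polynomials misbehave when you feed them many points:

```python
import numpy as np
from scipy.interpolate import lagrange

# A few neighboring points bracketing the region of interest (made-up values).
x = np.array([1.2, 1.5, 1.8, 2.1])
y = np.array([0.35, 0.52, 0.74, 0.98])

poly = lagrange(x, y)                    # interpolating polynomial through the points
print(f"y at x = 1.65: {poly(1.65):.3f}")

# If y is monotone over the window, the same trick gives x for a chosen y.
inv = lagrange(y, x)
print(f"x at y = 0.60: {inv(0.60):.3f}")
```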
 

aparat
In my opinion, true film speeds would effectively be print judgement speeds, which, for the most part, are determinable, as they are based on the results of the process. The current ISO fixed density speed is less ambiguous, which is one reason why it was chosen. The other is that it approximates the fractional gradient speed by incorporating the Delta-X criterion. If it didn't, the results wouldn't have the same relevance and the standard probably wouldn't have been adopted. A fixed density method becomes problematic when it moves outside of the ISO parameters, unless it is used along with the Delta-X criterion. Most of the methods outlined online and in general photography books produce results that come relatively close to those obtained with Delta-X and the ISO standard, because under normal conditions the parameters are similar. Outside those conditions, their methodology and accuracy are dubious.

Maybe you should have included the graph showing the spread of the 0.10 fixed density method vs print judgement speeds with a variety of films and processing conditions.

I never spent any time on W speed because it was never adopted making it more of a historical footnote. Sorry Dale.

I'm not familiar with Varden so I don't know what his concerns were. Could you supply the publication information? Nelson was working on Delta-X at the same time as your Varden reference. Nelson was sure to be aware of these issues and sure to be aware of modern films and lenses.

One of the aspects that has always concerned me is the criterion the judges used to determine print quality. The papers from first excellent testing don't really specify, but Jones talks about it in The Psychophysical Evaluation of the Quality of Photographic Reproductions, PSA Journal, Vol 17, Dec 1951. "In this discussion, the term photographic quality will be used in referring to the degree of perfection with which the photographic picture reproduces in the mind of the observer the subjective impression which he received when looking at the original."

Were similar instructions given to the judges in the first excellent print tests? Statistically, most people use photography as a recording medium. In a normal distribution curve of how people use photography, those with more artistic intentions probably fall somewhere outside of the central area. It makes sense to cater to the greater population. What would your mom want to see when she got the photos back from the drug store? But what if the intention wasn't a realistic reproduction of how something looks, but an interpretation that departs from a realistic impression of the subject, or, for lack of a better term, an artistic interpretation? A situation where the emotional element supersedes a strictly technical reproduction. How would this affect Jones' print judgement speeds and, consequently, sensitometric speeds? Or, looking at it from a different perspective, how would film speeds based on Jones' print judgement speeds be applicable to the artistic work of Bill Brandt, Brett Weston, or Michael Kenna?

Perhaps, if print judgement speeds are based on a common and straightforward interpretation, reproducing a scene as close as possible to the impression of the actual scene, and a methodology is determined that identifies a limiting criterion equally applicable to all film types, it would produce a known, solid reference point from which departures can branch off and still have a commonality.

I agree that, ultimately, all important roads lead back to Jones's studies of print quality judgments. I still find it mind-boggling how he could have conceived of such a study, carried it out, and, as a result, presented a novel theory with lasting consequences and influence. I also enjoy reading his prose.

Yes, the w speed is a bit of a curiosity but I thought it would be interesting to see how it worked.

You raise a very interesting point with the artistic aspect of print quality. There's the term that photographers often use, "the fine print," which probably means different things to different people. I enjoy reading people's comments and critiques of photographs posted on Photrio and elsewhere because they can tell you a lot about how people perceive photographs, including print (or scan) quality. Eavesdropping at an art gallery used to be something I'd do from time to time, for the same reasons. Now I think it's a bit creepy.

The Varden paper was published in Photo Methods for Industry, 1958, Vol. 1, pp. 39-41, cont. on p. 67. I believe this was a kind of trade journal, but it did contain serious papers, as well as interesting trade insights. For example, here's a survey I found regarding people's personal EI. It lists some unique films, but it also seems to imply the older, larger safety factor. It's informal research, but it's interesting, nonetheless.

ieRatings.png
 
Stephen Benskin (OP)
The Varden paper was published in Photo Methods for Industry, 1958, Vol. 1, pp. 39-41, cont. on p. 67.

Found it, but haven't read it yet, although I can guess what the argument is. I was looking for this paper years ago, but was blocked because the LA library had a fire and much of its archives were damaged and lost. Apparently this journal was one of them. The reason I was looking for it is that the article was referenced in the Nelson paper Safety Factors in Camera Exposure, the paper proposing the new (1960) ASA speed standard. The reference: ASA Exposure Index: Dangerously Safe, Lloyd E. Varden, PMI (Photo Methods for Industry), 1: 39 (May 1958). Found it here. Note the use of ASA Exposure Index and not ASA speed.

Nelson, "A number of articles in photographic magazines have pointed out the penalties and disadvantages resulting from the use of too large a safety factor and have urged that a smaller safety factor be introduced by means of a revision in the American Standard for determining ASA exposure indexes for black-and-white negative films. The general spirit of these articles is illustrated by the following title of one of them: "ASA Exposure Index: Dangerously Safe."" Zone System practioners, who feel the current ISO under exposes film, can blame articles like this for the change. 🙂 I should also add, Zone System speeds tend to agree with the ASA Exposure Indexes that the article calls Dangerously Safe.

So the chart has the ASA daylight speeds and the PMI readers' EI speeds, which tend to be 1 stop faster. Now, ZS people, and those influenced by their talk about "personal film speeds," say to halve the ISO speed setting. Irony?

Finished it, and I guessed right. Better than the average popular photography article. Good with most of the background. Some of his reasoning and conclusions are questionable. As a precursor to the first excellent print series of papers, Jones wrote a paper outlining the arguments for a minimum useful gradient. He addresses Luther (later also addressing some of Luther's arguments in the course of the series), as well as updates of the 1928 paper on the same topic that Varden references. Jones, LA & Russell, ME, Minimum Useful Gradient as a Criterion of Photographic Speed, JOSA, Vol 25, Dec 1935. It's been too many years since I've read it. I've always regarded it as the hypothesis for the justification of the first excellent print testing.
 

ic-racer
I see the origin of "W speed" and "Delta X" as clear "templates" to place over the hand-drawn H&D curve to determine the speed.

I also see them as each being based on a very easy-to-determine landmark, combined with a mathematical 'fudge factor' to align with 0.3G.

For example, I see Delta-X as based on the Y-value of 0.1 above film base, with a calculated fudge factor (the Delta-X) to align with 0.3G.

Similarly, I see W-speed as based on the X-value of the inertia point (X intercept), with a calculated fudge factor (the W) to align with 0.3G.

From a simple computing standpoint, the X-value (inertia, X intercept) is much easier to determine, as just about every piece of simple math software since the 1980s has had a linear regression function.

However, as I just discovered today, there are now convenient tools on the internet to obtain the Y-value of 0.1 from a complex equation. The value Delta-D could also be easily determined from the quadratic equation.

The points above are 'simple computing' with tools that come installed on a home computer.

Nothing compared to Aparat's computer analysis which has advanced to the point I'm expecting him to do a mathematical analysis of the middle "S" portion of the curve to get a more robust "G" with which to make the 0.3G.
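To put a few lines of code behind the "simple computing" point above: the inertia point really is just the x-intercept of a straight-line fit to the mid-section. A Python sketch on synthetic data (the density window used to pick the "straight-line" steps is my own arbitrary choice, and the published w and Delta-X offsets from Nelson and Simonds are not reproduced here):

```python
import numpy as np

# Synthetic (log H, density) readings.
log_h = np.linspace(-3.0, -0.2, 15)
density = 0.10 + 2.0 / (1.0 + np.exp(-(log_h + 1.6) / 0.35))

# Take the steps between densities 0.6 and 1.6 as the "straight-line" portion.
mask = (density > 0.6) & (density < 1.6)
slope, intercept = np.polyfit(log_h[mask], density[mask], 1)

inertia_log_h = -intercept / slope       # x-intercept of the regression line
print(f"inertia point at log H = {inertia_log_h:.2f}, straight-line slope = {slope:.2f}")

# W-speed (and similarly Delta-X from the 0.1-over-fog point) would then apply
# the published correction term to this landmark; those constants aren't shown here.
```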
 

aparat
@Stephen Benskin I agree with your interpretation of the Varden paper. My feeling was that some industry insiders were pushing for the lowering of the safety factor. We can only speculate on their motivations, beyond the sensitometric case they try to make. If photographers in the 1940s and 1950s were anything like today, the change may have been enticing to both manufacturers and consumers. And, yes, the irony struck me, too. It's strange (and fascinating) how these trends have changed over time.

I find some of these trade journal articles insightful because they provide some of the more applied side of sensitometry and photography, in general. The lack of peer review means we have to take their claims with a grain of salt, but that's okay, in my book, at least for some of the claims.

@ic-racer I think I can agree with your general take on the w speed and Delta-X speed calculations from the Nelson and Simonds paper. They can be a bit of a moving target. The simple least squares model for estimating the Delta-X and w speed equations would probably be questioned by a discerning statistician today, but, for most of us, I think they are good enough approximations. I hope to one day run a script that would go through all of my film data and compare the different film speed models, but I need to find the time to actually do it.

By the way, the numerical method of calculating the Jones 0.3G speed needs to account for the entire curve in order to find the local (fractional) gradients along its trajectory, including any of the non-linearities in the mid-tones. It can always be improved, no doubt. My goal with this software is to ultimately offer users a few different methods for most of the computed parameters, so as not to make it too opinionated. I think the otherwise excellent Win Plotter app could have been much better received if it had some of those options available. The way it is now, you kind of have to follow the BTZS system, or, at the very least, the Zone System, to get most benefit out of it.

Speaking of the Jones fractional gradient method, the notion of sensitometer light quality comes up a few times. In the Jones and Russell (1935) paper that @Stephen Benskin mentioned above, the sensitometer light quality is described as:
"The radiation used in making the sensitometric exposure shall be identical in spectral composition to that specified by the Seventh International Congress of Photography in defining the photographic unit of intensity for the sensitometry of negative materials." (p. 410)
I must admit, I have no idea what that means, because I failed to obtain the source, listed as "Proc. VIIth Int. Cong. Phot. (1928), p. 173," but Jones brings up the idea again in the 1949 paper:

"The quality (spectral composition) which has been adopted by international agreement for sensitometric measurements on negative materials is that emitted by a specified tungsten lamp-filter combination that corresponds to the quality of mean noon sunlight in Washington, D.C." (p. 132).

I have come across a couple of mentions of the fact that such a light source is not as actinic as actual daylight, and that a factor of 1.3 should be considered to make sensitometer-derived data more compatible with real-world applications. It's a pretty obscure detail, but I thought I'd bring it up to see what you guys think of this. My sensitometer has a white (as white as could be currently obtained) and green LED light source. My DIY device uses an incandescent bulb with an 80A filter. Neither is ideal, but I wonder how much of an error they introduce compared to the devices used in 20th-century sensitometry research.
 

ic-racer
My sensitometer has a white (as white as could be currently obtained) and green LED light source.

Excuse me for getting threads confused, but didn't you post you had an EG&G that you sent to get a calibration certificate? Or was that someone else?

I never even attempted any ISO test in my darkroom. Without the EXACT light source specified by ISO, I'd not rely on any result.

I did perform an extensive test of sensitometer light with respect to home-made control strips which have nothing to do with ISO testing.

Conclusion was that for common films (no oddball spectrum) the light source did not matter.

That is, exposure and development deviations were appropriately accounted for in the control strip irrespective of the light used to make the control strip.

So, in a typical B&W darkroom, one is not required to purchase the pre-exposed strips (assuming one could even find them) to have exceptional control of their system.

So, in my darkroom, Ilford Delta 100 is defined as ISO 100 and the Wejex incandescent sensitometer (with 80A in its filter drawer) is my "standard" sensitometer for development and exposure control. I could have based my personal standard on my Xenon EG&G as Bill and Steve, but my exposures in the field are more like 1 second and the Wejex sensitometer uses a 1 second exposure.



Wejex 80a.jpg
 
Stephen Benskin (OP)
Excuse me for getting threads confused, but didn't you post you had an EG&G that you sent to get a calibration certificate? Or was that someone else?

I never even attempted any ISO test in my darkroom. Without the EXACT light source specified by ISO, I'd not rely on any result.

I did perform an extensive test of sensitometer light with respect to home-made control strips which have nothing to do with ISO testing.

Conclusion was that for common films (no oddball spectrum) the light source did not matter.

That is, exposure and development deviations were appropriately accounted for in the control strip irrespective of the light used to make the control strip.

So, in a typical B&W darkroom, one is not required to purchase the pre-exposed strips (assuming one could even find them) to have exceptional control of their system.

So, in my darkroom, Ilford Delta 100 is defined as ISO 100 and the Wejex incandescent sensitometer (with 80A in its filter drawer) is my "standard" sensitometer for development and exposure control. I could have based my personal standard on my Xenon EG&G as Bill and Steve, but my exposures in the field are more like 1 second and the Wejex sensitometer uses a 1 second exposure.



View attachment 335218

You are probably referring to me. I had a number of EG&G sensitometers calibrated back in the day. I sent them to an EG&G department in MA, and they tested the exposures and parts, replaced the electrical cord, and supplied me with a new set of attenuators. All for about $400 when I first started calibrating them, and around $1,300 by the time I stopped. A few years ago, I attempted to contact them, and they are no longer there. Probably another victim of digital. You might have better luck.

1680970242440.png
 

ic-racer
You are probably referring to me. I had a number of EG&G sensitometers calibrated back in the day. I sent them to an EG&G department in MA, and they tested the exposures and parts, replaced the electrical cord, and supplied me with a new set of attenuators. All for about $400 when I first started calibrating them, and around $1,300 by the time I stopped. A few years ago, I attempted to contact them, and they are no longer there. Probably another victim of digital. You might have better luck.

How are they holding up? I had to replace the Xenon tube in mine with one of slightly different construction. It works perfectly now, but it has no traceable standard. I try to indicate a "relative" value for the X-axis when I label my H&D curves.

EG&G Light Chamber 2.jpeg
 
Stephen Benskin (OP)
@Stephen Benskin I agree with your interpretation of the Varden paper. My feeling was that some industry insiders were pushing for the lowering of the safety factor. We can only speculate on their motivations, beyond the sensitometric case they try to make. If photographers in the 1940s and 1950s were anything like today, the change may have been enticing to both manufacturers and consumers. And, yes, the irony struck me, too. It's strange (and fascinating) how these trends have changed over time.

I find some of these trade journal articles insightful because they provide some of the more applied side of sensitometry and photography, in general. The lack of peer review means we have to take their claims with a grain of salt, but that's okay, in my book, at least for some of the claims.

@ic-racer I think I can agree with your general take on the w speed and Delta-X speed calculations from the Nelson and Simonds paper. They can be a bit of a moving target. The simple least squares model for estimating the Delta-X and w speed equations would probably be questioned by a discerning statistician today, but, for most of us, I think they are good enough approximations. I hope to one day run a script that would go through all of my film data and compare the different film speed models, but I need to find the time to actually do it.

By the way, the numerical method of calculating the Jones 0.3G speed needs to account for the entire curve in order to find the local (fractional) gradients along its trajectory, including any of the non-linearities in the mid-tones. It can always be improved, no doubt. My goal with this software is to ultimately offer users a few different methods for most of the computed parameters, so as not to make it too opinionated. I think the otherwise excellent Win Plotter app could have been much better received if it had some of those options available. The way it is now, you kind of have to follow the BTZS system, or, at the very least, the Zone System, to get most benefit out of it.

Speaking of the Jones fractional gradient method, the notion of sensitometer light quality comes up a few times. In the Jones and Russell (1935) paper that @Stephen Benskin mentioned above, the sensitometer light quality is described as:
"The radiation used in making the sensitometric exposure shall be identical in spectral composition to that specified by the Seventh International Congress of Photography in defining the photographic unit of intensity for the sensitometry of negative materials." (p. 410)
I must admit, I have no idea what that means, because I failed to obtain the source, listed as "Proc. VIIth Int. Cong. Phot. (1928), p. 173," but Jones brings up the idea again in the 1949 paper:

"The quality (spectral composition) which has been adopted by international agreement for sensitometric measurements on negative materials is that emitted by a specified tungsten lamp-filter combination that corresponds to the quality of mean noon sunlight in Washington, D.C." (p. 132).

I have come across a couple of mentions of the fact that such a light source is not as actinic as actual daylight, and that a factor of 1.3 should be considered to make sensitometer-derived data more compatible with real-world applications. It's a pretty obscure detail, but I thought I'd bring it up to see what you guys think of this. My sensitometer has a white (as white as could be currently obtained) and green LED light source. My DIY device uses an incandescent bulb with an 80A filter. Neither is ideal, but I wonder how much of an error they introduce compared to the devices used in 20th-century sensitometry research.

Varden's overall conclusion was correct. The large safety factor was a remnant. Some of Varden's other conclusions tended to be overstated or one-sided. By the late 1950s, exposure meters were more accurate and film processing was more controlled (many people in the '40s didn't even have exposure meters), so there was no longer any reason to have such a large safety factor. Larger format films can tolerate a fair amount of overexposure without a loss in quality, but quality loss is more apparent with smaller format films, which were becoming prevalent.

The actual fractional gradient method isn't used any more, so there's no reason to update it; however, if it was, the range used to determine the average gradient would probably be changed to account for the reduced flare from coated lenses. The method may no longer be used, but the fractional gradient point is still relevant. Delta-X has replaced the method, and Nelson showed that it has even better agreement with print judgement speeds. Indication of 0.3G is still valid using Delta-X as the determinant, in my opinion. The Δ1.30 log-H of the ISO contrast parameters / Delta-X method isn't intended to work in exactly the same way as the Δ1.50 range of the fractional gradient method, in that it represents the average gradient of the overall response of the film within the average illuminance range. Nelson writes that the value of ΔD "is simply another way of expressing this average gradient."
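A small numerical illustration of the ΔD-as-average-gradient idea, on a synthetic curve (Python/SciPy; the curve and base+fog value are made up, and the 0.80-over-1.30 aim in the final comment is simply my reading of the ISO contrast condition):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

log_h = np.linspace(-3.2, -0.2, 21)                  # synthetic data
density = 0.10 + 2.0 / (1.0 + np.exp(-(log_h + 1.7) / 0.35))
curve = CubicSpline(log_h, density)

# Speed point at 0.1 above base+fog, then the density 1.30 log H further along.
lo = brentq(lambda x: float(curve(x)) - (density.min() + 0.1), log_h[0], log_h[-1])
delta_d = float(curve(lo + 1.30) - curve(lo))

print(f"Delta-D over 1.30 log H = {delta_d:.2f}")
print(f"expressed as an average gradient: {delta_d / 1.30:.2f}")
# For comparison, the ISO aim of Delta-D = 0.80 over 1.30 log H works out to 0.80 / 1.30 = 0.62.
```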

In Safety Factors, Nelson writes the spectral quality for sensitometers had recently changed from Sunlight to Daylight, which incorporates more "skylight" blue. He didn't specify the new color temperature. The increased blue would require a change in the proposed constant in the speed equation from 0.95 to 0.80.
 

Bill Burk
Thanks for posting that article! I only remember the catch phrase “dangerously safe”.

My EG&G is doing fine. I adjust my graph template to agree with what I believe is its current calibration. When I do a TMY2 test that hits ASA parameters, Film Speed 400 on my top scale aligns with 0.10 density above base plus fog on that curve. If it doesn’t, I tweak the top scale for next time. I do not have to change graph paper very often because the EG&G is very consistent.
 

aparat
I also have mine professionally calibrated. The certification is valid for a year, and then I send it out again. To me, it is worth it to have a reliable device of known parameters. I get a very detailed report, including meticulous measures of all the variables involved. Above and beyond what one would typically expect from such a service.

I calibrated my DIY device myself, using equipment I rented, but it is not a certified calibration. Still, it gives me decently accurate results. I use both devices frequently.

Apparently, companies such as 3M, as well as the pipeline and shipbuilding industries keep the demand for such calibration services strong.
 

ic-racer
Thanks for posting that article! I only remember the catch phrase “dangerously safe”.

My EG&G is doing fine. I adjust my graph template to agree with what I believe is its current calibration. When I do a TMY2 test that hits ASA parameters, Film Speed 400 on my top scale aligns with 0.10 density above base plus fog on that curve. If it doesn’t, I tweak the top scale for next time. I do not have to change graph paper very often because the EG&G is very consistent.

Yes, practical solution!
 

ic-racer
I also have mine professionally calibrated. The certification is valid for a year, and then I send it out again. To me, it is worth it to have a reliable device of known parameters. I get a very detailed report, including meticulous measures of all the variables involved. Above and beyond what one would typically expect from such a service.

I calibrated my DIY device myself, using equipment I rented, but it is not a certified calibration. Still, it gives me decently accurate results. I use both devices frequently.

Apparently, companies such as 3M, as well as the pipeline and shipbuilding industries keep the demand for such calibration services strong.

OK, that is it: I was confusing ISO certification and calibration in my mind. You had the LED sensitometer calibrated, not an EG&G. But for an ISO test, the question is: is the calibration any good?

Screen Shot 2023-04-08 at 12.09.02 PM.png
 

Bill Burk
"The quality (spectral composition) which has been adopted by international agreement for sensitometric measurements on negative materials is that emitted by a specified tungsten lamp-filter combination that corresponds to the quality of mean noon sunlight in Washington, D.C." (p. 132).

I have come across a couple of mentions of the fact that such a light source is not as actinic as actual daylight, and that a factor of 1.3 should be considered to make sensitometer-derived data more compatible with real-world applications. It's a pretty obscure detail, but I thought I'd bring it up to see what you guys think of this.

I’ve been corresponding with a forum subscriber who is extremely consistent in test exposures to daylight/open shade, and I am looking for answers to why his results indicate the film is 2/3 stop more sensitive than its rated speed when he meets the ASA parameters.

I can explain 1/3 stop difference. But not the second third-stop difference. This could be it.
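Just to put that 1.3 figure into stop terms (my arithmetic, not anything from the papers):

```python
import math

factor = 1.3
print(f"{factor} x more exposure = {math.log2(factor):.2f} stop")   # about 0.38 stop
print(f"in log H terms: {math.log10(factor):.2f}")                  # about 0.11, i.e. roughly one third-stop step
```

So if the sensitometer light really does fall short of daylight by that factor of 1.3, it would be in the right ballpark for the remaining third of a stop.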
 

aparat
OK, that is it: I was confusing ISO certification and calibration in my mind. You had the LED sensitometer calibrated, not an EG&G. But for an ISO test, the question is: is the calibration any good?
Only to the extent that any calibrated device can be. There is always going to be some error. The certification covers the compliance not only of the device itself, including some of its components, but also of every instrument that was used for the calibration, in compliance with the relevant ISO, ANSI, IEC, NIST, and DIN standards. It is a very long list. Frankly, it's overkill for my purposes, but it's good to have it. I'm one of the very few customers who are private individuals, and not big corporate clients.


Still, I always point out that my tests are not professionally done and that they should be treated as reasonable approximations only, for use by amateur photographers. I think that sometimes people claim absolute certainty in measurements, settings, methods, chemical formulae, etc. done in an amateur context. I prefer not to do that myself.
 

aparat
I’ve been corresponding with a forum subscriber who is extremely consistent in test exposures to daylight/open shade, and I am looking for answers to why his results indicate the film is 2/3 stop more sensitive than its rated speed when he meets the ASA parameters.

I can explain 1/3 stop difference. But not the second third-stop difference. This could be it.

That might be the case, though daylight is highly variable. There's always going to be some trade-off involved, right?
 

Bill Burk
I also have mine professionally calibrated. The certification is valid for a year, and then I send it out again. To me, it is worth it to have a reliable device of known parameters.
It’s a bit of a fluke that I even have an EG&G at all (thanks to @Stephen Benskin for selling me such a great machine).

I am impressed that you get your equipment regularly calibrated. I can’t justify it.

I do a very low volume of work. (I mixed D-76 in June 2021 and May 2022, time for a new batch.)

But I do use it regularly. It only takes a few minutes to get it out, turn it on, place a piece of film on it and press the lid down (then put it away again).

I just got a bulk roll of Double-X 5222, so I will test time to reach ASA parameters in D-76 1:1, then develop a few rolls shot last weekend.

My backup DIY plan for a sensitometer (if something goes wrong with the EG&G that I can’t fix) is to cobble together an H&D sector wheel (pictured in Mees - I got it from Mark Osterman) with an old 8mm projector.
 
Stephen Benskin (OP)
Only to the extent that any calibrated device can be. There is always going to be some error. The certification covers the compliance not only of the device itself, including some of its components, but also of every instrument that was used for the calibration, in compliance with the relevant ISO, ANSI, IEC, NIST, and DIN standards. It is a very long list. Frankly, it's overkill for my purposes, but it's good to have it. I'm one of the very few customers who are private individuals, and not big corporate clients.


Still, I always point out that my tests are not professionally done and that they should be treated as reasonable approximations only, for use by amateur photographers. I think that sometimes people claim absolute certainty in measurements, settings, methods, chemical formulae, etc. done in an amateur context. I prefer not to do that myself.

Mind sharing the contact information?
 
Stephen Benskin (OP)
It’s a bit of a fluke that I even have an EG&G at all (thanks to @Stephen Benskin for selling me such a great machine).

I am impressed that you get your equipment regularly calibrated. I can’t justify it.

I do a very low volume of work. (I mixed D-76 in June 2021 and May 2022, time for a new batch.)

But I do use it regularly. It only takes a few minutes to get it out, turn it on, place a piece of film on it and press the lid down (then put it away again).

I just got a bulk roll of Double-X 5222, so I will test time to reach ASA parameters in D-76 1:1, then develop a few rolls shot last weekend.

My backup DIY plan for a sensitometer (if something goes wrong with the EG&G that I can’t fix) is to cobble together an H&D sector wheel (pictured in Mees - I got it from Mark Osterman) with an old 8mm projector.

Bill, happy it found such a good home.
 