ProfessorC1983
Member
I could use some guidance on how and when to properly set the gamma value when scanning B&W film, as I've managed to confuse myself a few different ways.
My goal here is to use digital scans solely as a "proofing" step for photos I eventually want to print using either VC silver gelatin or alternative processes. So what I'm after isn't "how to get the best digital image from this negative" so much as "how to get the best approximation of how an analog print of this neg will look with X process/paper," which would save me from wasting tons of time and paper on trial-and-error printing.
Where I get flummoxed is that there are three different gamma values in play: the contrast index of the negative itself (defined by film/developer/time), the screen gamma (2.2, right?) and the exposure scale of the paper/process I eventually want to use to print.
What I've been doing so far is scanning at 2400 dpi to 16-bit HDR RAW linear TIFF files in SilverFast 8, which seems to work as intended. The only adjustments I make in SilverFast are crop, rotate, and flip. My understanding is that the HDR RAW format does not encode a specific gamma.
I then process the image in either GIMP or ImageMagick: I take some "densitometer" readings of the raw scan, then trim both sides of the histogram, invert, scale down, and export to another file for visual proofing.
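For reference, the ImageMagick half of that currently looks roughly like this (the file names, the sample point, and the 0.5%/99.5% clip values are just placeholders for this post, not my real numbers):

magick scan_raw.tif -format "%[pixel:p{1200,800}]" info:                # spot "densitometer" reading at one pixel
magick scan_raw.tif -level 0.5%,99.5% -negate -resize 25% proof.tif     # trim histogram, invert, downscale, export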
But I'm never explicitly adjusting the gamma along the way, and I realize that I should probably be setting it to 2.2 before manipulating the histogram in order to get the most "true" on-screen representation of the negative's actual contrast index... right?
And then, if I want to "see" some approximation of how the image will look when printed, should I apply the gamma adjustment for the paper/process-specific contrast after adjusting for the screen but before trimming Levels? Does it matter?
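In ImageMagick terms, is the idea something like this, with the screen gamma first and a second, made-up value standing in for the paper/process contrast (I know a real paper curve isn't a plain gamma; the 0.8 is only there to illustrate the ordering I'm asking about)?

magick scan_raw.tif -gamma 2.2 -gamma 0.8 -level 0.5%,99.5% -negate -resize 25% proof_paper.tif   # 2.2 = screen, 0.8 = paper stand-in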
Or is this a fool's errand and I should just suck it up and start buying paper in bulk?
