Calculating density range of negative from linear scan with ImageMagick

Joined
May 3, 2020
Messages
282
Location
Washington, DC
Format
Large Format
There was a thread here a few months back about using SilverFast as a sort of poor man's densitometer, which linked to these very helpful pages:

How to make linear scans with SilverFast 8.8: https://www.sebastian-schlueter.com/blog/2017/2/10/how-to-make-a-linear-scan-with-silverfast-88
Using a scanner as densitometer: https://sites.google.com/site/negfix/scan_dens

Using these tutorials I now have high quality 16-bit linear HDR scans of my negatives, and know how to manually compute the density of a given area of the image.

However I'd like to take this a step further by creating an automated process to calculate (roughly?) the density range of a given linear scan of a negative, so that ideally I can take a pile of scans and easily identify which of them are the highest in contrast (to make the best subjects for alt-process printing which requires a DR of say >1.6).

I was a software engineer in a past life so I have the technical skills to pull this off using a toolkit like ImageMagick, but what I don't know is the best conceptual method for doing so. Here is what I have currently, in pseudo-code:

- Crop to 98% horizontal and vertical (to eliminate any film border I might've left in when cropping the prescan)
- Resize image to 1% (reduce to a manageable number of pixels)
- Calculate histogram of unique greyscale values (still in 16-bit, so essentially return the 1-65535 value of each pixel)
- Parse result, put in numeric order
- For the max and min values, take the log10 of (65535/X) to derive Dmax and Dmin, then take the difference as DR
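The last step above can be sketched directly in Python (pure stdlib; the pixel values are made-up stand-ins for whatever the histogram step returns, and 65535 is assumed as the clear-film/white reference):

```python
import math

# Sketch of the final step above: convert 16-bit linear pixel values
# (1-65535) into densities and take the range. The pixel list is
# made-up stand-in data; 65535 is assumed as the clear/white reference.
def density(value, white_point=65535):
    """Transmission density of one pixel: D = log10(reference / value)."""
    return math.log10(white_point / value)

def density_range(pixels):
    """Dmax comes from the darkest (lowest) pixel, Dmin from the lightest."""
    d_max = density(min(pixels))
    d_min = density(max(pixels))
    return d_min, d_max, d_max - d_min

dmin, dmax, dr = density_range([1200, 8400, 22100, 60100])
```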

This seems to "work" as far as it goes, and clearly lets me identify more-contrasty negs, but I have no idea how accurate my derived "density range" is compared to what a real densitometer would generate. The fuzziest part seems to be in deciding how big of an area to sample - is the average value of each 1% block of the image a reasonable approximation? I've also played around with using smaller sample sizes but then throwing out a few values on either end of the spectrum because they seemed to throw off calculations pretty badly.

I recognize it may not be possible to get an exact value but is there anything I could be doing better to get an approximation useful enough to determine suitability for different types of printing methods?
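One way to formalize the "throw out a few values on either end" idea is to read the range at fixed percentiles instead of the absolute min/max, which makes the estimate robust to dust specks and pinholes. A minimal stdlib sketch; the 1%/99% cut-offs are arbitrary assumptions to tune:

```python
import math

def robust_density_range(pixels, lo_pct=1, hi_pct=99, white_point=65535):
    """Density range between two percentiles of the pixel distribution,
    so isolated outliers (dust, pinholes) can't skew Dmin/Dmax."""
    s = sorted(pixels)

    def pct(p):
        # nearest-rank percentile on the sorted pixel list
        idx = min(len(s) - 1, max(0, round(p / 100 * (len(s) - 1))))
        return s[idx]

    darkest = pct(lo_pct)    # lowest value  -> densest area  -> Dmax
    lightest = pct(hi_pct)   # highest value -> thinnest area -> Dmin
    return math.log10(white_point / darkest) - math.log10(white_point / lightest)
```

With a single pinhole pixel in the data, the strict min/max method reports a wildly inflated Dmax, while the percentile version ignores it.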
 

PhilBurton

Member
Joined
Oct 20, 2018
Messages
467
Location
Western USA
Format
35mm
Sounds really good. I would offer to be a beta tester if I were further along in my own personal scanning project (ask me in 3-4 months :redface: ). How about a further wrinkle: modify the resulting scan's file name, or else export a CSV file with the scan file name, the min/max densities, and the DR.
 

138S

Member
Joined
Dec 4, 2019
Messages
1,776
Location
Pyrenees
Format
Large Format
I made a freeware tool, which I'm about to release, for exactly that. It provides very accurate values, and calibrated densities for the Stouffer patches can be entered.

Scan the negative alongside the wedge, click on the first and last patch (to show the red rectangles), and then point the mouse at any spot and it reports the density.

If you enter the Lux·seconds of the exposure the enlarger is to perform (Lux on the easel without the negative, multiplied by exposure time), then it reports the exposure on paper for each spot, in Lux·s and in H-Log units.

It also does the film and paper calibrations (curves), so it calculates the mask and adds the mask to the negative as if it were a new negative, calculating the paper image.

Instead of refining a version to be released I advanced too far into new features, like proofing the print with a color contrast mask, but I plan to stop development to make a distributable version with the basic features that are already working very well.

Linear scans are not necessary, since the non-linearity also applies to the wedge, so it works the same. Anyway, I want to add 16-bit/channel TIFF support, which it presently lacks, to take full-range scans.


If someone wants it I can send it to them; it would be good to have opinions, but they should provide an AnyDesk or similar connection so I can teach them how to use it.



 

jslabovitz

Member
Joined
Nov 27, 2007
Messages
63
Location
Shanghai, WV
Format
Medium Format
I used to do a lot of analysis like this when I was doing more Quadtone printing, and built up a little toolkit of tools and functions. It’s fun stuff!

Are you asking essentially how to verify that your process gives the same or similar results as a densitometer? Sorry for being obvious, but how about comparing with a known-good instrument? If I had a densitometer I'd volunteer to measure a print or two for you and report back the values; then you'd at least have something quantitative to evaluate. (I do have an X-Rite i1 Pro spectrophotometer, and would be happy to measure something with it if it would help.)

Alternatively, as @Fraunhofer suggested, scan a step wedge with measured values, and compare that to the values you obtain from your process. You’ll end up with the differences in the form of a calibration curve (or LUT) that you could use both for verification and calibration (getting a near-linear result in the end would indicate success).
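The wedge comparison can be captured as a simple piecewise-linear LUT: pair each patch's known density with the value your pipeline reads for that patch, then interpolate any scanned value to a calibrated density. A stdlib-only sketch; the calibration pairs below are hypothetical placeholders, not real Stouffer values:

```python
from bisect import bisect_left

# Hypothetical calibration pairs: (raw 16-bit scanned value, known wedge
# density). Real pairs would come from your own scan of a measured wedge.
CAL = [(2000, 2.10), (8000, 1.50), (25000, 0.80), (60000, 0.15)]

def calibrated_density(value):
    """Piecewise-linear interpolation of density from a raw scanned value,
    clamped to the first/last calibration point outside the table."""
    xs = [x for x, _ in CAL]
    i = bisect_left(xs, value)
    if i == 0:
        return CAL[0][1]
    if i == len(CAL):
        return CAL[-1][1]
    (x0, d0), (x1, d1) = CAL[i - 1], CAL[i]
    t = (value - x0) / (x1 - x0)
    return d0 + t * (d1 - d0)
```

With enough patches this doubles as the verification curve: feed the wedge values back through and check how close the output is to the published densities.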

A few observations on your process:

- Be careful about the resizing step; I'd be concerned about distortion of values. ImageMagick has many resizing filters, and they may not be appropriate for your goal of data normalization. You might look into doing a Gaussian blur and then computing the histogram, without resizing at all. (Avoid premature optimization!)

- Many traditional ICC/grayscale profiling targets have repeated patches, ranging from 4x to maybe 16x the number of values in the range you’re trying to hit. If you have 4x, say, you can easily throw out the two that are the most different, and still average the remainder. If the deviation is too much, then you may need to throw out the whole step sample — or the whole scan.

- You might not need this if you're going to carefully observe the curves, but obviously each step value should be consistently higher or lower than the one before (i.e., monotonic). If not (if there's a peak or valley in the middle), the tonal scale isn't useful. Sounds obvious, but I found this helpful as a verification step.
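That monotonicity check is cheap to automate once you have the per-step means. A small sketch; the tolerance parameter is my own assumption, added to absorb scanner noise between adjacent steps:

```python
def is_monotonic(step_means, tolerance=0):
    """True if the patch readings move in one consistent direction,
    allowing reversals up to `tolerance` to absorb scanner noise."""
    diffs = [b - a for a, b in zip(step_means, step_means[1:])]
    increasing = all(d >= -tolerance for d in diffs)
    decreasing = all(d <= tolerance for d in diffs)
    return increasing or decreasing
```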

I’d love to see your tool if you care to share it!
 
Joined
Aug 29, 2017
Messages
9,710
Location
New Jersey formerly NYC
Format
Multi Format
Epson Scan also has a densitometer reading. I've never used it, so I don't know how accurate and effective it is. Maybe someone else has and can give some details.
 

138S

Member
Joined
Dec 4, 2019
Messages
1,776
Location
Pyrenees
Format
Large Format
This tool reports the R-G-B values of the scan, but those values are not the sensitometric density in the negative. Still, if you scan a density wedge alongside the film, then you can locate which patch is closest to the spot in the negative.
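That patch-matching approach is easy to script: given the raw reading for a spot and the raw readings of each wedge patch, report the calibrated density of the nearest patch. A sketch with made-up numbers; real ones would come from your own wedge scan:

```python
# Hypothetical per-patch raw readings mapped to calibrated densities;
# real numbers would be read off a wedge scanned alongside the film.
PATCHES = {62000: 0.05, 41000: 0.35, 27000: 0.65, 18000: 0.95, 12000: 1.25}

def nearest_patch_density(raw_value):
    """Calibrated density of the wedge patch whose raw reading is
    closest to the sampled spot's reading."""
    closest = min(PATCHES, key=lambda v: abs(v - raw_value))
    return PATCHES[closest]
```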

SilverFast also has a similar tool.
 
Joined
Aug 29, 2017
Messages
9,710
Location
New Jersey formerly NYC
Format
Multi Format
How would that tool help me in scanning?
 

138S

Member
Joined
Dec 4, 2019
Messages
1,776
Location
Pyrenees
Format
Large Format
If your 16-bit/channel scan captures the whole histogram and you later adjust curves in Photoshop, then it is not useful.

Instead, if you craft your image directly in the scanner software, by adjusting clip levels and the curve there, then you can read the values of the pixels in your image to know if you are clipping highlight or shadow detail, for example.

Personally I do not use those tools much, as I work all my images in Photoshop. They may be useful for images you do not want to spend much effort on, where you just want a decent final image straight from the scanner software, without editing later in Ps.
 
Joined
Aug 29, 2017
Messages
9,710
Location
New Jersey formerly NYC
Format
Multi Format
Doesn't checking the histogram accomplish the same thing? Why would you want to look at specific pixels when you have the histogram?
 

138S

Member
Joined
Dec 4, 2019
Messages
1,776
Location
Pyrenees
Format
Large Format
Not the same: with the "densitometer tool" you check particular areas in the frame to see what reading you have there. Maybe you want to allow clipping in certain areas of the scene but not in others.
 
OP
Joined
May 3, 2020
Messages
282
Location
Washington, DC
Format
Large Format
Are you asking essentially how to verify that your process gives the same or similar results to a densitometer? Sorry for being obvious, but how about comparing with a known-good instrument?

Yes, that's exactly the goal, and I should have made that clearer: the whole point here is to avoid shelling out for a densitometer if I can help it. :smile: But the suggestion to use a step wedge as a source of truth is well-taken, and that's a small investment I don't mind making for meaningful calibration. Thanks for the additional advice as well! I'll absolutely post the script on GitHub and link here once I'm confident in the results.
 
Joined
Aug 29, 2017
Messages
9,710
Location
New Jersey formerly NYC
Format
Multi Format
How do you allow clipping in specific areas?
 

138S

Member
Joined
Dec 4, 2019
Messages
1,776
Location
Pyrenees
Format
Large Format
Well... you don't directly allow (or disallow) clipping in certain areas, but you can check with the Epson Scan RGB densitometer tool whether the levels you set cause clipping in certain areas.

Of course, this is useful when you mostly adjust the final image in the scanning software. If you are going to work on the image in Photoshop, then you usually capture the whole histogram in 16 bits/channel and clip nothing; you take all the dynamic range the negative has, to keep all your options open for future editing. Of course that image will be dull and very flat, and it will require a curves adjustment in Photoshop or Lightroom.


The dynamic range the negative has recorded from the scene can be much larger than the one a monitor or a paper can show, so to render the mids with a suitable gradient (like in the scene) you may have to clip extreme shadows and highlights, or compress them so they do not take up too much of the range in the display medium or device, allowing the mids to be contrasty enough.

You can do that job in the scanning... or you can capture the whole histogram and adjust curves later in the image editing software, which is what we usually do for important images.
 

Lachlan Young

Member
Joined
Dec 2, 2005
Messages
4,974
Location
Glasgow
Format
Multi Format
A calibrated step wedge, plus the mathematical means in the link you posted to convert density into LAB numbers, seems a reasonable approach, and it should allow you to build a curve set for various times and developers. However, it is important to remember that it isn't the be-all and end-all: while it'll show that HP5+ has a softer toe than TX400, it won't help you if you place your exposure too deep into the toe for adequate separation, though it might help you understand what EI and processing times you should choose.
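The density-to-LAB direction can be sketched for the L* channel alone, assuming transmittance 10^-D is treated as relative luminance Y in the standard CIE L* formula (a simplification: a full LAB conversion also needs the a/b channels and a white point):

```python
def density_to_Lstar(d):
    """CIE lightness L* of a patch with transmission density d, treating
    transmittance 10**-d as relative luminance Y (a simplification)."""
    y = 10 ** -d
    # standard CIE piecewise function: cube root above the linear cutoff
    f = y ** (1 / 3) if y > (6 / 29) ** 3 else y * (29 / 6) ** 2 / 3 + 4 / 29
    return 116 * f - 16
```

Density 0 maps to L* = 100 (clear film), and higher densities fall off non-linearly, which is why equal density steps don't look like equal lightness steps.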
 