As part of my color carbon endeavor, I was mucking about with some tests yesterday and I hit upon something I can't quite explain.
TL;DR: I'm not seeing much difference in the dmax of prints made with the same exposure series on one tissue with a low and one with a high sensitizer concentration.
Here's what I did:
I took two identical tissues from the same batch. Around 5x7", 0.5% India ink, 8% gelatin, 0.25% or so glycerin, 3% sugar.
One tissue I sensitized with 0.5ml of 16% ammonium dichromate plus a few ml of ethanol. The sensitizer was applied with a foam roller, and the tissue was dried completely by running room-temperature air over it in a drying box with a powerful fan.
Second tissue sensitized and dried using the same procedure, but 0.5ml 4% dichromate. So effectively only 25% of the dichromate load compared to the first tissue.
Then I exposed both tissues using the same contact printing frame, light source and negative. The negative is a digital print onto screen printing film made with an Epson 3880, using its black + yellow inks with normal ink load (so no additional ink density). Each tissue I printed as a series of test strips, with one-stop increments. The 16% dichromate tissue got exposures of 30 seconds, 1 minute, 2 minutes, 4, 8, 16 and 32 minutes. The 4% tissue got exposures of 1 minute, 2, 4, 8, 16, 32 and 64 minutes. The strips were made by incrementally covering the tissue with a piece of rubylith and then giving additional exposure.
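Side note for anyone wanting to repeat this: since the cumulative exposures double at each step, every additional exposure under the rubylith is simply equal to the running total up to that point. A minimal sketch of that bookkeeping, using the times from my 16% series:

```python
# Incremental exposures needed to build a doubling (one-stop) cumulative series
# by progressively covering strips with rubylith.
cumulative = [0.5, 1, 2, 4, 8, 16, 32]   # target cumulative exposures in minutes

increments = [cumulative[0]] + [
    cumulative[i] - cumulative[i - 1] for i in range(1, len(cumulative))
]
print(increments)        # [0.5, 0.5, 1, 2, 4, 8, 16] -> each step equals the previous total
print(sum(increments))   # 32.0 minutes of total lamp time
```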
The negative is a simple series of 10% density increments from 0% to 100% density fashioned in GIMP. No correction curve was applied to this file. Once more: both tissues were printed from the same negative. This is the digital version of the negative; the print onto transparency film is pretty faithful (you'll have to take my word for it; my old Epson actually had a good day when I printed it):
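In case anyone wants to reproduce the target digitally rather than draw it by hand: something along these lines produces an equivalent 11-step wedge. The dimensions and file name are just placeholders, and it treats the 10% steps as linear grayscale values, which is essentially what I did in GIMP:

```python
import numpy as np
from PIL import Image

steps = 11                    # 0% to 100% density in 10% increments
band_w, height = 100, 400     # arbitrary band width and height in pixels

# 0% density = white (255), 100% density = black (0), as a positive grayscale file
values = np.linspace(255, 0, steps).round().astype(np.uint8)
row = np.repeat(values, band_w)          # one pixel row of the wedge
wedge = np.tile(row, (height, 1))        # stack rows to full height

Image.fromarray(wedge, mode="L").save("step_wedge.png")
```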
(The rubylith seems to do an OK job overall, but with long exposures (16 minutes and longer) there seems to be some fog due to halation in the contact printing glass and perhaps the tissue itself. While this may skew the results somewhat, I don't think it explains what I think I'm seeing.)
I then processed both tissues in the same way; transferred onto Yupo, dried, and then scanned together in one go. Which looks like this:
Note the annotations; the second tissue is actually the top one. Sorry about that. I noted the number of minutes (or seconds) of each of the horizontal bands on the left.
I used GIMP to take Lab measurements and recorded the L (lightness) value, on a scale of 0 to 100, of the darkest column on the left and the lightest (white) column on the right. So effectively I measured the dmax and dmin of each of the strips on both prints. I tabulated the values for the 1, 2, 4, 8, 16 and 32 minute exposures (both tissues got 0.5ml of sensitizer; only the concentration differs):

| Time (minutes) | 4% tissue: darkest / dmax (L) | 4% tissue: lightest / dmin (L) | 16% tissue: darkest / dmax (L) | 16% tissue: lightest / dmin (L) |
|---|---|---|---|---|
| 1 | 31.4 | 94.9 | 35.5 | 95.3 |
| 2 | 25.9 | 93.9 | 29.3 | 95.3 |
| 4 | 22.5 | 93.8 | 24.2 | 95.2 |
| 8 | 20.5 | 93.8 | 21.4 | 94.5 |
| 16 | 18.3 | 92.8 | 17.8 | 91.9 |
| 32 | 17.1 | 72.2 | 16.5 | 75.6 |
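If you'd rather script this measurement than eyeball it in GIMP, the sRGB-to-L* conversion is easy enough to do directly. This is just a sketch using the standard sRGB/D65 math; the file name and patch coordinates are made up, and GIMP's Lab readout can differ slightly depending on the working profile:

```python
import numpy as np
from PIL import Image

def srgb_to_lstar(rgb):
    """CIE L* (0-100) from sRGB values in 0..1, using the standard D65 formulas."""
    rgb = np.asarray(rgb, dtype=float)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    y = lin[..., 0] * 0.2126 + lin[..., 1] * 0.7152 + lin[..., 2] * 0.0722
    return np.where(y > 0.008856, 116 * np.cbrt(y) - 16, 903.3 * y)

# Hypothetical file and patch location; average L* over a hand-picked patch of the scan
scan = np.asarray(Image.open("scan.png").convert("RGB")) / 255.0
patch = scan[100:150, 200:250]
print(round(float(srgb_to_lstar(patch).mean()), 1))
```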
Maybe the table is a bit abstract, but if you look at the numbers, you may already spot the source of my surprise. To make it easier, this is what it looks like in a simple line-connected scatter plot:
Number of minutes on the horizontal axis, L-value in % on vertical axis.
The orange set of lines is the 16% sensitizer tissue; the green set is the 4% sensitizer tissue. The top pair of lines shows the low densities (high L values), the bottom pair the high densities (darkest values).
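For anyone who'd rather replot this themselves, the chart is nothing more exotic than the numbers from the table above fed straight into matplotlib:

```python
import matplotlib.pyplot as plt

minutes = [1, 2, 4, 8, 16, 32]
dmax_4,  dmin_4  = [31.4, 25.9, 22.5, 20.5, 18.3, 17.1], [94.9, 93.9, 93.8, 93.8, 92.8, 72.2]
dmax_16, dmin_16 = [35.5, 29.3, 24.2, 21.4, 17.8, 16.5], [95.3, 95.3, 95.2, 94.5, 91.9, 75.6]

plt.plot(minutes, dmin_16, "o-", color="orange", label="16% dmin (L)")
plt.plot(minutes, dmax_16, "o-", color="orange", label="16% dmax (L)")
plt.plot(minutes, dmin_4,  "o-", color="green",  label="4% dmin (L)")
plt.plot(minutes, dmax_4,  "o-", color="green",  label="4% dmax (L)")
plt.xlabel("Exposure (minutes)")
plt.ylabel("L value")
plt.legend()
plt.show()
```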
Alright, so the obvious observations are:
* Both tissues respond in a logarithmic fashion to increased exposure (a quick fit illustrating this is sketched right after this list). No surprise.
* At around 16 minutes, the tissue starts to build density through the black parts of the negative. In other words, from there onward, the blocking power of the ink in the high density parts of the negative is no longer sufficient to make a 'paper white'. The only thing surprising about this is that it actually takes so much exposure to print through this pretty normal amount of ink. Not bad for cheap transparency film.
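To put a rough number on that 'logarithmic' claim: fitting the 16% dmax readings against stops (log2 of the exposure time) gives a reasonably straight line. It fits L values rather than true density, so take it as indicative only:

```python
import numpy as np

minutes = np.array([1, 2, 4, 8, 16, 32])
dmax_16 = np.array([35.5, 29.3, 24.2, 21.4, 17.8, 16.5])   # L values from the table

# Fit L = a * log2(t) + b; a straight line against stops is what
# "responds logarithmically" amounts to here.
a, b = np.polyfit(np.log2(minutes), dmax_16, 1)
print(round(a, 1), round(b, 1))   # roughly -3.8 L per stop, intercept around 33.6
```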
A much less logical, but very interesting, observation is this: there seems to be only a minor difference between the maximum density of the 4% and 16% tissues at the same exposure times, even though there is a factor of 4 difference in sensitizer strength. Huh? Yes, there's a difference. The 16% tissue makes a slightly higher dmax for the same amount of exposure. It's a visible difference in real life, but you actually have to look pretty hard to spot it. In other words: it's quite minimal - relative to my expectations, that is.
Is this a result of messing with inkjet digital negatives, which turn out to behave more like halftone screens than actual continuous tone negatives? In which case I'd be surprised, because the maximum density strip in the negative is not a 100% perfectly even density either, and I don't see very apparent dot gain issues in that band - they should have been there if the system did behave like a true halftone screen setup.
Or is this due to the self-masking effect of dichromate in gelatin? That the 16% tissue is trying (in a way) to make more density, but the exposed dichromate itself is inhibiting this?
So here's my question: how would you explain this minimal difference between these two sets of density measurements, especially in terms of the highest density measurements for each exposure time? I would have expected the 16% tissue to create far more density at the same exposure time than the 4% tissue. Turns out it doesn't...!?
In case you're wondering: yes, there is a difference in response curve, and that's pretty significant. Here is what it looks like:
For this, I measured the density of each vertical band in both prints for the 8 minute exposure. I then normalized these values; that is, I mapped the readings onto a 0-100% scale. The orange plot is the 4% tissue, the blue one is the 16% tissue. Note that you can no longer make an absolute comparison between the tissues based on this plot, but the difference in curve shape seems quite significant.
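The normalization itself is nothing fancy - just a min-max rescale of the band readings for each print. A sketch with made-up readings (not my actual 8-minute data):

```python
import numpy as np

def normalize(readings):
    """Map a set of band readings onto a 0-100% scale (min-max rescale)."""
    r = np.asarray(readings, dtype=float)
    return (r - r.min()) / (r.max() - r.min()) * 100

# Hypothetical L readings for the 11 bands of one 8-minute strip:
bands = [20.5, 24, 29, 36, 45, 55, 66, 77, 86, 91, 93.8]
print(np.round(normalize(bands), 1))
```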
Well, shame on me for not having done more systematic testing before - or for not paying sufficient attention when I did. But this is kind of surprising to me. I feel rather stupid for not having spotted this before, I must say.
Am I going crazy? (I mean, with regards to this test; I perfectly well know I am in general, you see.)