The numbers are official Kodak guidance taken from Z-131. Yes, that's assuming no replenishment. @Mick Fagan, there are substantial differences between Kodak's official recommendations, which tend to be conservative, and community wisdom. I have assembled only the manufacturer's data there. The one place where I made an exception was the "Chemical Storage Life" table.
I don't replenish the developer, and use a JOBO. For every 570 ml, I process up to 4 rolls of 135 and 1 control strip in the fifth slot. My last run was less than 0.01 deviation (HD-LD) from the reference strip for the red and blue channels, and 0.04 in the green channel.
Thank you. Funny, but right now I am reading JOBO manuals, and I have a question for JOBO users specifically regarding C-41:
- Does it really keep the developer at 100F?
I looked at how it's constructed and I find it hard to believe that the developer doesn't lose some temperature as you pour it into the tank through the lift. Is the lift tunnel heated? I lose about 1-2F when I pour 100F developer from one pre-heated bottle to another, yet the JOBO instructions for C-41 advocate for keeping the water bath at 100F, as if nothing gets lost during pouring. True/false?
I pre-heat my tank for 3:15, and temper my developer at 39-40.5 C depending on how many rolls I'm running and the ambient room temperature.
Adrian, I think this is a bit misleading. When I first read it, I'm thinking to myself, "how could this be?" Then I'm thinking, well, maybe if the film is only lightly exposed. Then I get to thinking, I wonder how he's reading these: is it actually a Status M densitometer, or is he deriving the numbers from something else? And then I think, why does he give only the "contrast" (HD-LD in control strip nomenclature, meaning a high-density patch minus a low-density patch), but not the LD value? So I'm real distrustful of the values, yet it IS what you're getting, so who can argue with that? Still, it runs counter to what I've spent a large part of my adult work life dealing with. But your next post largely answers this:
Since the process is spec'd at 37.8 C (100 F), this means that you're actually starting your developer above that aim by 1.2 to 2.7 C (roughly 2 to 5 F), which is really quite a lot. So the fact that you are raising the temperature to more or less compensate for extra rolls helps explain why the "development," more specifically the contrast values, didn't fall off.
I think this needs to be pointed out because people who are not familiar with the temperature numbers may not catch this, and may instead conclude that extended use of the developer has little effect.
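For anyone who wants to check those temperature numbers, here is a quick back-of-the-envelope sketch of the offset calculation. The 39-40.5 C starting range comes from the post above and the 37.8 C aim is the C-41 spec; the function and variable names are mine, purely for illustration.

```python
# Back-of-the-envelope check of the developer starting-temperature offset.
C41_AIM_C = 37.8  # C-41 developer aim temperature, i.e. 100.0 F

def delta_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit degrees."""
    return delta_c * 9.0 / 5.0

for start_c in (39.0, 40.5):                 # reported starting range
    offset_c = start_c - C41_AIM_C           # how far above the 37.8 C aim
    print(f"{start_c:.1f} C start is {offset_c:.1f} C "
          f"({delta_c_to_f(offset_c):.1f} F) above aim")

# Prints roughly 1.2 C (2.2 F) and 2.7 C (4.9 F) above aim --
# the "roughly 2 to 5 F" mentioned above.
```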
But... it seems odd to me that Kodak would be so free about boosting the temperature this far. Since you referred to your source, the Z-131 manual, I took a quick look. I interpret this differently than you do.
On page 3-2 (this is section 3 of the Z manual) they have a "Developer Starting Temperature..." section. (This section is actually using a so-called "sink line" rather than a rotary processor.) What Kodak seems to be saying is to actually develop at the spec temp (37.8 C, 100.0 F), but... start out with a higher developer temperature to compensate for the cooling effect of lowering a rack of film into the developer.
The rotary processor section starts on page 3-6 where they have some comments about prewarming the tube (see footnotes for table 3-5).
On page 3-7, there is a brief discussion, see "Developer Temperature and Time..." From that section: "The best way to compensate for loss of developer temperature is by increasing the developer time (from the standard time of 3:15)." As far as I can see, Kodak is not specifically recommending increasing the developer temperature. I can't really offer a qualified opinion on this, as my experience is almost exclusively with systems that are always kept in spec (aside from periodic screwups).
I'm not trying to be nitpicky about this, and wouldn't have bothered posting had you been clear from the start that you had elevated the developer temperature. But without that you have put me in an awkward position. I had just explained how sparse use of a color developer is likely to cause problems, and then you step in to essentially say, "well, I use those quantities of developer in a Jobo and (certain parts of) my control plots are right on the money." So if I don't respond it might appear that I am conceding the argument (which I would not do). My apologies if it appears I'm overdoing it, and best wishes with your system.
Here's the full reading from the last run of today:
Thanks for the full set of example data.
Adrian, when you first posted the HD-LD it didn't look right to me, knowing that it was processed with a limited amount of developer. So from my point of view, having just estimated the Br (restrainer) levels for an "average" roll, I would not expect the plot to be so close to aim. (At 0, 4, and 0 hundredths, they're closer to aim than most well-controlled processes.) With the bromide levels running 15-20% high (at the tail end of processing), the control strip plots ought to be appreciably lower. But this is possibly explained by the elevated starting temperature; even though it falls back into spec after a minute, there could still be a considerable effect.
But now, seeing all the numbers, it's pretty clear that they're not Status M. (I presume you're trying to derive them from scanner data, which cannot legitimately be done, at least not without prior knowledge about the spectral characteristics of everything.)
The things that immediately stick out are the very high numbers for Dmin (clear film). A typical Kodak pro color neg film would have densities of roughly Red = 0.20, Green = 0.65, and Blue = 0.85, whereas yours are all substantially higher, above 1.00. Now, it's certainly possible to have something go wrong with the process, producing a high base stain level. But getting those same numbers (above 1.00) on a Kodak reference strip, the one actually processed by Kodak, is near-certain evidence of incorrect readings.
Of course you can still use the data for "process control," seeing it as a statistical process, etc. But I wouldn't be so sure that it's giving you a clear picture of Kodak spec limits; I suspect that the differences are being minimized.
the readings are not nulled out, so the base fog is relative. That should be obvious given that the reference control strip readings (DMIN AIM) are also high. The readings labeled AIM are readings from the supplied reference strip plus the correction factors supplied with that batch of control strips.

Is it Status M? No, it's not. I'm working with what I've got. I've been on the hunt for a good color densitometer for a while, but the state of the film processing industry is in shambles, and quality equipment is either not that abundant or prohibitively expensive. So I make do and work with what I've got.

I can tell you that, in an effort to validate my own process, I've sent very carefully constructed test rolls to a fair number of labs here in the US, and I can say with a pretty high level of certainty that a significant number of labs running today are not doing any monitoring of any kind, because the results I got back varied wildly. I'm not going to name names, as that was a snapshot in time and my intent is not to name and shame, but simply to point out that the state of the industry is not what it was at its height.
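To make that AIM bookkeeping concrete, here is a minimal sketch of the relative comparison being described. Only the method (aim = reference-strip reading + batch correction factor) comes from the post above; the function names and density numbers are placeholders for illustration.

```python
# Minimal sketch of relative control-strip bookkeeping: the processed strip
# and the Kodak reference strip are both read on the same instrument, so
# nothing is nulled to absolute (Status M) density.

def aims(reference_readings, corrections):
    """Per-channel aim = reference-strip reading + batch correction factor."""
    return {ch: reference_readings[ch] + corrections.get(ch, 0.0)
            for ch in reference_readings}

def deviations(strip_readings, aim_values):
    """Per-channel deviation of the processed strip from the aims."""
    return {ch: round(strip_readings[ch] - aim_values[ch], 2)
            for ch in strip_readings}

# Placeholder HD-LD readings, purely for illustration:
ref  = {"red": 1.10, "green": 1.05, "blue": 1.00}   # reference strip, my reader
corr = {"red": 0.00, "green": -0.01, "blue": 0.01}  # batch correction factors
mine = {"red": 1.10, "green": 1.08, "blue": 1.01}   # today's processed strip

print(deviations(mine, aims(ref, corr)))  # -> {'red': 0.0, 'green': 0.04, 'blue': 0.0}
```

Because both strips go through the same reader, any constant offset in the raw readings drops out of the deviation, which is why high Dmin numbers don't upset this kind of process control even though they aren't true Status M densities.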
all that aside, let's get to the heart of the matter... I'm processing 4 rolls of film with 600 mL of one-shot developer. That gives me 150 mL of developer per roll. Given that the replenishment rate per roll in a replenished system is quite a bit lower than that, I'm not sure what the source of heartburn is. Please enlighten me.
Be careful with extrapolating minimum solution volume from replenishment rates.
it's easy enough to find an X-Rite 810 on eBay, though few if any come with the calibration stuff, which makes them not much better than what I'm already doing...
It's mainly just an interesting puzzle to me, how your system is working. But like I said earlier, you put me in an awkward position; since your earlier post seems contrary to what I had just said (that ~150 ml per "roll" should produce a measurable reduction in developer activity), if I don't respond I may be seen as conceding the point. Now, since I have personal knowledge that C-41 developer is pretty sensitive to development byproducts, I'm pretty confident in what I had posited, although it was a "back of the envelope" sort of thing, and it's possible that I could have been completely off on my estimate.
Plus I was curious as to how you could possibly have control strips plotting so well, essentially right on the money. Anyway, I think that the elevated starting temperature explains a lot of this.
Well, the thing is, if you're already matching the reference control strip, there's not much need for this. I just suspect that the narrow spectral zone seen by the Status M response will exaggerate density variations a little more than a broader spectral zone would, so you might get a little more jitter on the plots. But it's possible that your current reading system might not be hitting the right spectral areas.
Regarding calibration, there are a handful of workarounds. For film, for example, you "zero" the instrument with nothing in the gate, reading "air" as it's sometimes called. So you only need a higher-density patch to set the "slope." The original check plaques look like black & white film, so you could probably use one of those IF you knew the density values. You wouldn't be too far off using a step tablet from Stouffer (or however it's spelled). You could just presume that all colors get the same value; this'll be plenty close enough for most practical purposes.
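Here is a rough sketch of that two-point idea: zero on "air," then set the slope from a single patch of known density. The numbers and function names below are illustrative assumptions, not readings from any particular instrument.

```python
# Rough two-point calibration sketch: "zero" with nothing in the gate (air),
# then set the slope from one patch of known density (e.g. a Stouffer step).
# The raw_* numbers are made up purely to show the arithmetic.

def make_calibration(raw_air, raw_known, known_density):
    """Return a function that maps raw instrument readings to densities."""
    zero = raw_air                              # reading "air" defines density 0.00
    slope = known_density / (raw_known - zero)  # one known patch sets the slope
    return lambda raw: (raw - zero) * slope

# Say air reads 0.05 and a patch known to be 2.00 density reads 2.10:
to_density = make_calibration(raw_air=0.05, raw_known=2.10, known_density=2.00)
print(round(to_density(1.10), 2))  # a raw 1.10 reading comes out to about 1.02
```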
no argument from me there, I totally get that. I also get the minimum chemistry needed per roll, though 1 roll per 250 mL seems very conservative. I've not seen a large difference between 150 mL, 200 mL, and even 300 mL per roll, though I have in the past run 5 whole rolls in 600 mL (and 6 rolls of 120), and when the developer comes out it looks and smells pretty used. Backing off to 4 rolls per 600 mL, that doesn't really happen. I'd be OK with 3 rolls per 600 mL, but at much less than that I might as well close up shop, as my throughput wouldn't be profitable.
I wouldn't be surprised if 150 mL per roll is right on the ragged edge of the minimum developer for most emulsions, and potentially not enough if I had 4 rolls of 400/800 speed film, or 4 rolls of heavily exposed film. I say this because I have noticed a very marked difference in the appearance and smell of the used developer at the end of the run the times I've done 5 or 6 rolls per batch.
in terms of the starting temperature, you'd be amazed how much heat you lose in a Jobo through the simple act of taking the tank off the machine, pouring in the developer, and putting the tank back on the machine. The starting temperature sounds really high at first blush, but a lot of that heat is sucked out just by doing that alone. It's even worse if you have a Jobo with a lift system, as none of the channels that you pour the chemistry through are heated, so you lose a huge amount of heat just getting the chemistry through the lift before it's even entered the tank. In terms of temperature management, using a Jobo is nothing like processing with other systems. The number one problem is making sure you have enough heat going in, since you lose a lot of it just getting the chemistry into the tank.