I was making a print today from an infrared neg of a tree with carvings, which had a heavily overexposed background. First of all I made a test strip at f/8 in 4 sec increments and devved it the usual way in Ilford Multigrade. The ideal exposure for the tree was 10 secs, which equates to 80 secs in the lith soup. I burned around the sides of the tree for 88 secs, but this wasn't giving me enough detail.

So I opened up the aperture to f/5.6 and made another test strip to see how much exposure the background needed. It turned out to be 20 secs, which makes it 160 secs in the lith. So I cut the exposure for the tree in half to 40 secs to compensate for opening up a stop, and burned in the background for a further 120 secs, giving a total of 160 secs, which should've been correct for the sky.

Upon developing the print, the exposure of the tree matched the first print correctly, but the background was more washed out than on the previous print, which had a total of 136 secs. Argh, too many numbers here, confusing!

What I'm really asking is: how come my highlights were non-existent when I exposed for 160 secs, 3 stops more than the 20 secs the test strip indicated for full tones? I've heard of reciprocity failure in paper, but I'm sure I've used longer exposures than this before, plus the rest of the print developed fine. Hmm! Any explanations? I think I really could do with one of those pre-flashing doofers :confused:
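To keep the numbers straight, here's a quick sketch of the exposure arithmetic described above. The 8x lith factor is my assumption taken from the 10 sec → 80 sec figure in the post, and I'm treating f/8 → f/5.6 as exactly one stop; none of this models paper reciprocity failure, which may well be the real culprit at these long times.

```python
LITH_FACTOR = 8        # assumed: 80 s lith / 10 s normal dev = 8x (3 stops)
STOPS_F8_TO_F56 = 1    # f/8 -> f/5.6 opens up one full stop

# Tree: 10 s test at f/8 -> lith time at f/8 -> halved at f/5.6
tree_lith_f8 = 10 * LITH_FACTOR                        # 80 s
tree_lith_f56 = tree_lith_f8 / 2 ** STOPS_F8_TO_F56   # 40 s

# Background: 20 s test at f/5.6 -> lith time at f/5.6
bg_lith_f56 = 20 * LITH_FACTOR                         # 160 s

# Extra burn needed on top of the 40 s base exposure
burn = bg_lith_f56 - tree_lith_f56                     # 120 s

print(tree_lith_f56, bg_lith_f56, burn)
```

So the bookkeeping checks out: 160 secs really is 3 stops over the 20 sec test (20 → 40 → 80 → 160), matching the 8x lith factor, which is why the washed-out highlights point at something the simple arithmetic doesn't capture, like reciprocity failure.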