Every once in a while, I'll come across a posting on APUG or photo.net along the lines of... [tired old film/developer combo thread omitted] "...and be sure to use a high-acutance developer to maximize resolution." The posters act as if fine-grain developers contained all manner of evil silver solvents that happily munch away at your negative, turning your grain to mush and obliterating fine detail. I'm not a photo-chemist, but is it possible that the posters have got this backwards, or are at the very least exaggerating?

From the very little I have read about adjacency effects (e.g. the so-called "border" and "fringe" effects), it would seem that these might actually result in diminished resolution in the immediate area of the image in which they occur. Applying an exaggerated unsharp mask to an image in Photoshop also seems to suggest this could be true.

At the same time, I know that all the metallographs produced in my grad school lab work back in the mid-90s were developed using D-76c, and I understand that this developer is very commonly used in scientific work. Being from the D-76 family, I doubt it's considered a "high-acutance" developer... but I suspect that scientific researchers aren't keen on losing image resolution in their work.

I have not seen any scientific findings on the web regarding the benefits of acutance developers for resolution. Does anybody have any such findings they can share?