Agreed; they often come as JPG or TIFF files scanned on Noritsu minilab scanners, which is already lossy in detail, and with only 24-bit color depth (16.7M colors)
Too bad they don't offer them in RAW format so you can do it yourself (maybe some do?)
Home 35mm scanners are capable of 48-bit color, even RAW output, allowing much more latitude to adjust to your liking, just like your digital camera's RAW files.
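The difference between 24-bit and 48-bit color is easy to quantify; here is a quick sketch of the arithmetic (illustrative only):

```python
# Rough arithmetic behind the bit-depth figures above.

def color_count(bits_per_channel: int, channels: int = 3) -> int:
    """Number of distinct colors representable at a given per-channel depth."""
    return (2 ** bits_per_channel) ** channels

# 24-bit color: 8 bits per RGB channel -> ~16.7 million colors
print(f"{color_count(8):,}")    # 16,777,216

# 48-bit color: 16 bits per RGB channel -> enormously more editing headroom
print(f"{color_count(16):,}")   # 281,474,976,710,656
```

The extra headroom is why heavy tonal adjustments on a 48-bit scan band and posterize far less than the same edits on an 8-bit-per-channel JPG.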
I would never trust a lab scan's color accuracy.
I don't think I've EVER printed color that mediocre! Blaah in a minilab voice. And how did the highlights get so miserably washed out?
Is it true that when developing film there is some degradation from processing, whereas that's avoided going from an SD card directly to your computer…?
This question is ambiguous and cannot be shown to be "true" or "false." For instance, "processing" has two different meanings: digital transformations are not directly related to chemical reactions. The term "degradation" means different things in each domain as well. (For instance, I could imagine that a chemist might describe development as "degrading" or "reducing" silver compounds, etc.) Continuing in this manner, "SD cards" are not equivalent to "celluloid films." Finally, the end product of a digital photograph is a collection of bits that persists only in the presence of associated computing machinery, whereas the end product of a film photograph is an etching on celluloid that persists as long as it is maintained in an acceptable environment---for instance, away from fire and other hazards.
Critical observation, thank you. I'd considered including some discussion of "process," but in the interest of brevity did not include your important observation in my post.
IMHO the more important problem with the question is that "degradation by processing" implies (besides the explicit film-digital comparison) a comparison of unprocessed and processed film. But film that remains unprocessed has no value to image making; the thing that's supposed to be degraded (I assume it's the image) does not exist until processed. It's a false premise.
Give me time to process that…!
It's not a photograph until the latent image on the film (which cannot be seen) is processed (to where it can be seen). Think about it.
With a 12 MP full-frame DSLR like the original Canon 5D or Nikon D700, you can make excellent 30x40cm prints.
And very good 40x60cm prints.
And still good 50x75cm prints.
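Those size judgments line up with a back-of-envelope resolution calculation. The sketch below assumes the D700's 4256 x 2832 pixel count (other 12 MP bodies differ slightly):

```python
# Back-of-envelope print resolution for a 12 MP 3:2 sensor.
# 4256 pixels on the long edge is the D700's actual count; treat it as
# an assumption here, since exact figures vary by camera model.
CM_PER_INCH = 2.54

def ppi(pixels_long_edge: int, print_long_edge_cm: float) -> float:
    """Pixels per inch along the print's long edge."""
    return pixels_long_edge / (print_long_edge_cm / CM_PER_INCH)

for size_cm in (40, 60, 75):
    print(f"{size_cm} cm long edge: {ppi(4256, size_cm):.0f} PPI")
```

That works out to roughly 270, 180, and 144 PPI respectively, which tracks the "excellent / very good / still good" progression above.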
My question is: does film have any degradation from processing, since it isn't as realistic as the photos from the MD?
I agree that film doesn't have the color accuracy of a real scene, but that will not stop me from shooting that format…!
When you "take" a digital photograph, at least in my limited understanding, a process of collecting data onto a substrate (the sensor) is performed and the camera's software processes these data points into a "raw image," which is augmented data that can then be processed (displayed) by visualization and editing software, such as Capture One, Lightroom, etc.
I suppose in some sense this arrangement of bits on a metal substrate, i.e., encoding image data onto the sensor, is similar to creating a "latent image" on film? Couldn't say, because I'm not a chemist nor am I familiar with demosaicing algorithms.
As far as your question about the film negative, you could certainly expose film in a totally dark room, process that film and expect to see an image of a "dark room." And putting any photograph in a dark box does not change its property of "being a photograph." It's just a photograph that you put in a dark box.
Any photograph (digital or analog) of something in the world is counterfactual: meaning that if the thing photographed was "different" in some way, then its photographic image would be different in the same way. The same is NOT necessarily true of a painting: when I look at a portrait on a canvas I have no assurances that the model who posed for that portrait resembles the painting ... or, for that matter, if any model ever existed.
Now, it might be an interesting question to ask if an image that had been totally generated by an algorithm, i.e., having no counterfactual analog in the outside world, is more like a photograph or a painting?
I suppose in some sense this arrangement of bits on a metal substrate, i.e., encoding image data onto the sensor, is similar to creating a "latent image" on film?
It actually works a bit differently than that. An exposure on film is a direct capture of what the lens provides it, which then has to be subjected to a chemical process in order for it to be visible to the naked eye.
A digital sensor is more of an antenna which detects the wavelengths of light, which are then recorded as values that go through several stages of "conditioning" before being used to construct the raw image data in a separate process.
It then goes through a final conversion process which produces the complete, recognisable image.
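As a rough illustration of those "conditioning" and conversion stages, here is a toy develop function for a single sensor reading. All the constants are hypothetical, and a real converter also demosaics the Bayer pattern and applies color matrices and tone curves:

```python
# Toy sketch of raw-conversion stages: black-level subtraction,
# white-balance gain, and gamma encoding on one sensor value.
# Constants are made up for illustration, not from any real camera.

BLACK_LEVEL = 64        # sensor pedestal (hypothetical)
WHITE_LEVEL = 16383     # 14-bit sensor full scale
WB_GAIN = 1.9           # per-channel white-balance gain (hypothetical)
GAMMA = 1 / 2.2         # simple display gamma, standing in for a tone curve

def develop(raw_value: int, wb_gain: float = WB_GAIN) -> int:
    """Convert one raw sensor reading to an 8-bit output value."""
    # Subtract the pedestal and normalize to 0..1 linear light
    linear = max(raw_value - BLACK_LEVEL, 0) / (WHITE_LEVEL - BLACK_LEVEL)
    # Apply white balance, clipping at full scale
    balanced = min(linear * wb_gain, 1.0)
    # Gamma-encode and scale to 8-bit output
    return round((balanced ** GAMMA) * 255)

print(develop(8000))
```

The point of the sketch is only that each stage is a numeric transformation of recorded values, quite unlike film's chemical development.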
Through Photoshop, one could get both…!
Thank you for the clarification! I assume that once the electronic capture process completes, an algorithm (or several algorithms) implements the "final conversion process"?
My question is, what format is more accurate in its processing forming the final image…?
My question in response is: how "accurate" does the process need to be in order for the photographer to create "the image"? The history of photography is replete with essential images that are far from "accurate," if we measure accuracy by information loss between what was there at the moment of the photograph's creation and the final product. I suspect the answer depends first upon the photographer's imagination, with process, equipment, etc., following in no particular order.
These kinds of questions have been part and parcel of the discussion since the invention of photography. Improvements in technology are important, but not tantamount to the many other elements, some outside of the photographer's control, that come into play in the fate of the image.