For someone who grew up in the 21st century, the accepted limits of what to expect from a photographic emulsion have always been a given.
Film's technical refinement hasn't reached new heights in decades now, and there are different flavours for different shooters, whether that be conventional cubic emulsions or tabular-grain films, for example. We also know exactly what QC and consistency to expect from mainstream manufacturers.
But something I've often wondered is: how has this changed over time? Especially during the early years, when coating emulsions onto "film" was a new concept in and of itself, and ultimately how that translated for the shooter and the processor.
Because no matter how good the camera was, the core limitation largely came down to the film itself.
Over time I have heard bits and pieces about early emulsions; most recently I learned what the first emulsions Kodak sold in 35mm were. It seems unbelievable to me that B&W stocks topped out around ISO 50 and were given names such as "Super Sensitive". Even Kodachrome was a mere ISO 10 originally.
So, were these films, once considered the best of the best, actually that great by modern film standards?
Were they more delicate to handle? Did they curl like buggery when drying? Did they handle over- and underexposure like modern stocks? Did they offer a level of sharpness and grain remotely comparable to now?
What was QC like? How did things evolve during and after major world events like the two world wars? And when did the evolution taper off to where we are today?
I'm really not an expert on this and only know some details, but as I'm also a little interested in movie cameras, I can give you some fragments:
Basically everything was worse. The further you go back in time, the harder it was to take a good picture. Speed was lower, grain was bigger, spectral sensitivity was orthochromatic or even narrower, developers were more toxic, and emulsions didn't keep as long and were more sensitive to storage conditions before development.
When the movie camera was invented around 1895, a big problem was that a camera running at 8 fps (more or less a standard speed at the very beginning of cinema) had an exposure time of only about 1/16 of a second: a rotary shutter exposes each frame for roughly half the frame interval, so at 8 fps that works out to about 1/16 s per frame.
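To make that arithmetic concrete, here is a minimal sketch, assuming the usual simplification of a roughly 180-degree rotary shutter (the function name is just illustrative):

```python
def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure per frame: the shutter is open for shutter_angle/360
    of each frame interval, which lasts 1/fps seconds."""
    return (shutter_angle / 360.0) / fps

# Early hand-cranked camera at 8 fps with a ~180 degree shutter:
print(exposure_time(8))    # 0.0625 s, i.e. 1/16 s
# The later silent-era standard of 16 fps would halve that:
print(exposure_time(16))   # 0.03125 s, i.e. 1/32 s
```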
This exposure time was too short for many of the emulsions of the day, so the speed of the emulsion had to be increased somehow, with the result that the emulsion was not even orthochromatic but mostly blue-sensitive. Actors with red lips, normally red cheeks or just a slight sunburn would have dark grey or even black faces on the screen. So the entire face was powdered white, and then blue lipstick and makeup were used to get the eyebrows back, etc.
When people today see an early movie, they sometimes think there was a strange make-up fashion in those days, but it actually had to be done because early movie film was mostly blue-sensitive.
There were entire make-up sets containing different shades of blue lipstick, powder, etc., and manuals on how to make up an actor's face when using blue-sensitive film stock.
This had to continue until panchromatic film became standard, because orthochromatic film isn't sensitive to red, so red lips render deep black and reddish cheeks render dark grey, and a slight sunburn, well, can make you look like an African. I once used orthochromatic movie film while I had a slight sunburn, and when I saw myself it was hilarious, because I looked like I had naturally black skin.
I don't want to offend anybody or get into politics, but sometimes I wonder whether black actors were also rejected in the early days because it would have been impossible to film their true skin colour. Sometimes in an early movie there is a black actor in a minor part, and often you cannot really recognize the face. You can see the whites of the eyes and the teeth, but everything else is deep black. To get this onto early movie film you also had to powder the entire face white, and then use an appropriate blue-toned powder to darken the face again: not so dark that you lose the features of the face, but not so bright that it looks like a white actor's. The tonal range of early films was smaller, making it really hard to get this right.
AFAIK, this is why the 35mm still camera was invented.
It takes perforated 35mm film, just like a movie camera, and was used for test shots. You would load some of the movie film into the still camera and take pictures of the set to check the lighting and tonal range, probably also of the actors to check how their blue makeup would render on screen, but probably also to test the emulsion itself, because there was much bigger inconsistency in film production. The box might say ISO 10, but the actual roll could behave more like ISO 5 or maybe ISO 13; tonal range and base fog could differ too, also depending on how old the stock was and how it had been stored and shipped. Too warm, too damp or too aged, and speed dropped, base fog rose, etc.
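To put that variation into perspective, here is a minimal sketch of how big those speed errors are in exposure terms; the ISO figures are just the illustrative numbers from above, not measured data:

```python
import math

def speed_error_in_stops(nominal_iso: float, actual_iso: float) -> float:
    """ISO speed is logarithmic: each doubling is one stop, so the
    exposure error is log2(actual / nominal)."""
    return math.log2(actual_iso / nominal_iso)

# A box marked ISO 10 whose contents actually behave like ISO 5 or 13:
print(speed_error_in_stops(10, 5))   # -1.0  -> a full stop slower than marked
print(speed_error_in_stops(10, 13))  # ~+0.38 -> about a third of a stop faster
```

A full stop of hidden speed loss is easily the difference between a usable negative and a thin, foggy one, which is why shooting a test strip first made sense.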
You also had to develop soon. Today you can develop half a year or even a year after exposure without losing the image on the negative or suffering speed loss or base fog; back then you were supposed to develop within a few days. The datasheet for current Kodak Vision3 500T still says you should develop as soon as possible after exposure, because 500T is a pretty high-speed colour film. Back then, a slow B&W film was about as sensitive to storage conditions and time of development as a high-speed colour film today, or even worse.
You also had to pay more attention to what (chemical) fumes the emulsion was exposed to. I once had an old book on photography from about 1905, when glass plates and wooden plate holders were still in use. There was advice on how to restore a used wooden holder: use only a certain special paint whose chemicals would interact less with the emulsion on the glass plate, and let the freshly painted holder rest open for several weeks or even months (!) so the paint's chemicals could evaporate and wouldn't damage the emulsion, even though a plate only sat in the holder for a few hours at a time. And develop as soon as possible after exposure.
So, there were a lot of problems and a lot of improvements.
Today improvements have probably slowed down, but Kodak, for example, is still improving. Their Vision line started maybe in the late 90s, was then improved to Vision2 and then to Vision3; grain got smaller, colours got more natural, etc.
So there is still improvement, probably also because it is easier to improve an emulsion using computers for the calculations and so on. Environmental laws led to bans on certain toxic raw chemicals, and manufacturers had to reformulate some of their emulsions because of this anyway, but film quality did not really suffer for every manufacturer and type of film.
Some films even got better while becoming more environmentally friendly. That could be seen as the real evolution of film.