One of the things I learned in my career was that manufacturers constantly sought tighter tolerances. As the high/low limits grow, it becomes more likely that a part at the low end of its tolerance range eventually won't fit or function with a mating part at the high end of its range.
On the Science Channel, there's a series titled "How It's Made". Now, you're not going to become an expert on any of the subjects they show unless you already are one. However, one of the things they highlight is the constant computerized quality checking now done on a lot of products. In many instances, products not meeting tolerances are sent back for rework automatically, without human intervention. One of the programs showed how camera lenses are made, and the process is remarkable.
I worked for a major auto manufacturer for 38 years in finance and IT, and our bosses wanted us to be familiar with all phases of manufacturing, not necessarily to work there, but so we could recognize changes in engineering and manufacturing. In our plants, the torque guns used to drive screws and nuts send the torque reading for every screw, nut, and bolt installed to file servers, so each vehicle has a complete manufacturing history. Now, most of us can't imagine why anyone would want a door panel screw torque history, but someone needs it, either to improve the process or the design.
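To picture what that per-fastener history might look like, here's a minimal sketch of the kind of record such a system could log. The field names, tolerance band, and values are purely hypothetical; I'm not describing the manufacturer's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical per-fastener record; real plant systems will differ.
@dataclass
class TorqueRecord:
    vin: str                # vehicle the fastener went into
    station_id: str         # assembly station / torque gun
    fastener_id: str        # which screw, nut, or bolt on the build sheet
    target_nm: float        # spec torque in newton-metres
    measured_nm: float      # torque the gun actually reported
    timestamp: datetime

    def in_spec(self, tolerance_nm: float = 0.5) -> bool:
        """Flag readings outside an assumed +/- tolerance band."""
        return abs(self.measured_nm - self.target_nm) <= tolerance_nm

record = TorqueRecord("EXAMPLE-VIN-00000", "TRIM-07", "DOOR-PANEL-SCREW-3",
                      4.0, 4.2, datetime.now(timezone.utc))
print(record.in_spec())  # True: 0.2 Nm off target, inside the assumed band
```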
Door panel screw torque history sounds like both a good way to cover people's backsides/find a scapegoat in case something goes wrong, and a job justification for someone... "CEO's nephew needs something to do? Ehh... Go have him track how much torque we're putting on those door panels..."
But tighter tolerances are typically sought either for increased precision and consistency in the end product (as a way to command a better market price) or as a means of reducing costs (by reducing the need for hand fitting and increasing the interchangeability of parts).
However, for lens assemblies I can see scenarios where automated digital testing of individual lens components may have allowed the industry to loosen overall tolerance levels and achieve greater production throughput without a negative impact on overall quality. Manually sorting lens elements so you can combine them into packages that give acceptable image quality would be difficult and costly if they were so loosely spec'd that one element in the series was unlikely to have a 'good optical fit' with a randomly selected example of the next element type in the lens design. But electronic binning of parts with automated testing could let you accept lower overall tolerances in the machining and finishing of individual elements, because you can then select the tolerance after the fact.
Rather than having to settle for binning as:
- garbage: value below spec
- usable: on spec
- garbage: value above spec
You can expand it to something like:
- garbage: value below spec
- usable: on spec A
- usable: on spec B
- usable: on spec C
- garbage: value above spec
Here, elements landing in the A and C bins would previously have been either tossed as garbage or used anyway as a potentially 'poor, but still acceptable' fit.
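As a concrete illustration, here's a minimal sketch of that kind of multi-bin sorting. The deviation thresholds and the idea of pairing complementary bins are invented numbers for the sake of the example, not any real lens spec:

```python
from collections import defaultdict

# Hypothetical thresholds, in mm of deviation from the nominal dimension;
# a real optical spec would involve far more than a single number.
def bin_element(measured: float, nominal: float) -> str:
    d = measured - nominal
    if d < -0.030:
        return "garbage-low"    # too far below spec to use at all
    if d < -0.010:
        return "A"              # usable, low side of the band
    if d <= 0.010:
        return "B"              # usable, centered on nominal
    if d <= 0.030:
        return "C"              # usable, high side of the band
    return "garbage-high"       # too far above spec

def sort_batch(measurements: list[float], nominal: float) -> dict[str, list[float]]:
    """Sort a production run into bins for later matching."""
    bins: dict[str, list[float]] = defaultdict(list)
    for m in measurements:
        bins[bin_element(m, nominal)].append(m)
    return bins

# Complementary pairing: a low-side element can offset a high-side one,
# so A+C or B+B pairs keep the assembled group near nominal.
COMPLEMENT = {"A": "C", "B": "B", "C": "A"}
```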
This is similar to how automated testing and verification methods for complex integrated circuits let the industry push boundaries by binning parts according to the specifications they matched after manufacture, rather than aiming for a monolithic specification defined before production. If the part met spec for the top-of-the-line Bin A, it could be sold as the premium product; a common flaw would let a chip be spec'd as Bin B, worse issues as Bin C, and so on. With the earliest production and testing methods, by contrast, you aimed to make everything fit Bin B, and it was either sold as a Bin B or tossed in the trash.
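The same idea in sketch form, with entirely made-up clock and core-count thresholds standing in for whatever a real fab's test suite actually measures:

```python
# Fall-through binning: test against the most demanding spec first and
# let failing parts settle into lower bins. Thresholds are invented for
# illustration; no real product line is being described.
def bin_chip(max_stable_mhz: float, working_cores: int) -> str:
    if max_stable_mhz >= 3600 and working_cores == 8:
        return "Bin A"   # premium SKU, fully functional die
    if max_stable_mhz >= 3200 and working_cores >= 6:
        return "Bin B"   # mid-tier SKU, e.g. flawed cores fused off
    if max_stable_mhz >= 2800 and working_cores >= 4:
        return "Bin C"   # budget SKU
    return "reject"      # the only outcome the old pass/fail model allowed

print(bin_chip(3400.0, 6))  # "Bin B"
```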
Which is a very long-winded way of saying:
I wonder: if we took 50 random Canon EF-S 18-55mm lenses made in the last 5 years, and 50 lenses from a comparably positioned lens design from the 1950s, would we actually see a noticeable difference in tolerances, and which way have they drifted?