You rightfully put quotes around "digital camera" in the description of this "product". 32x32 pixels and 4 bits/pixel would barely qualify as an imaging device of any kind. Even the Kodak experiment from 1975 looks like a high-resolution camera compared to this.
The first PC I got my hands on was in 1985. It had 1MB of RAM, an 8088 processor running at 4.77MHz and a 20MB hard disk. There may have been multi-megabyte storage media a decade before that, but "handy" is not a term I would use for them. If we assume that one needs at least 100,000 pixels to obtain a somewhat acceptable image, even that brand-new, expensive PC from 1985 would have been woefully inadequate to handle a series of images.
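To put rough numbers on that, here is a back-of-the-envelope sketch; the 8 bits per pixel for a minimally useful greyscale image is my own assumption, not something stated above:

```python
# Rough storage arithmetic for early digital images on a 1985 PC.
# Assumption (mine): 8 bits = 1 byte per pixel for a barely useful greyscale image.
pixels_per_image = 100_000                 # the "somewhat acceptable image" threshold above
bytes_per_image = pixels_per_image * 1     # ~100 KB per frame at 8 bits/pixel

ram_1985 = 1 * 1024 * 1024                 # 1 MB of RAM
disk_1985 = 20 * 1024 * 1024               # 20 MB hard disk

print(f"{bytes_per_image / 1024:.0f} KB per image")
print(f"{ram_1985 // bytes_per_image} images fit in RAM")
print(f"{disk_1985 // bytes_per_image} images fill the entire hard disk")
```

So a single frame eats about a tenth of the machine's RAM, and a couple of hundred frames fill the whole hard disk, before you even try to process anything.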
Digital photography progressed from "haha, that's funny" to "hey, I recognize something" to "wow, is that me?" to something that could at least manage newspaper pictures in the '90s. It was up to Kodak's management to draw a curve through these data points, and they chose not to draw an exponential curve, thereby totally missing the market a few years later. In the late nineties they built a high-volume coating facility for film instead of preparing for an orderly wind-down while trying to monetize the upcoming digital technology. That "exponential growth in performance" thing is Moore's law, and Kodak's management was either unaware of it, or chose to ignore it.
Moore's law is about everything, from CPU power to RAM size to magnetic storage device size to choice of storage medium. It's about the whole computer system. And it's an exponential law, unlike most aspects of chemical engineering.
The Intel 8088 CPU in that PC was built with 3µm technology, which is comparable to the pixel pitch of today's APS-C and full-frame DSLR image sensors and larger than the pixels in most other digital sensors. Now try to build an 8-bit ADC (as used in CMOS sensors) next to each such pixel! Nope, a 3µm process would not cut it.
Moore's law does not mandate smaller device structures or anything like that. In its popular formulation it simply states that device performance doubles every 18 months, and the reasons vary a lot. We hit all kinds of roadblocks along the way (feature sizes smaller than the UV wavelength, hard disks not speeding up any further, bus speeds unable to rise indefinitely, ...) and the thing keeps moving forward regardless. Storage technology changed, CPU architecture changed, viable chip size increased, acceptable cooling effort increased, lithography changed, and so on.
100 x 100 x 4 bit is not a whole heck of a lot better.
One was a prototype, the other was an actual product.
Sasson's camera was "just" one entry on a long list of image-digitising devices going back to at least the fifties.
He was perhaps the first to file an actual patent for a portable consumer digital camera, but far from the first to think of one, or to build experimental prototypes.
"Acceptable images" varies quite a lot with the use case. The Quantel buffer, PaintBox, SuperPaint, Cromemco and later Amiga in HAM compressed frame buffer mode made very acceptable photo representations on standard video equipment. Early successful digital cameras was not very high resolution either.
Something like the original Sony Mavica with a shoulder-slung disc drive would have been entirely feasible and usable in the early '80s. And indeed a similar system was used for newspaper printing just a few years later, for Olympic reportage.
Video just got all the attention for various reasons.
For many years, largely dictated by the CCD fabrication process, all the early ADC logic sat on a separate chip. And it was none the worse for it.
In fact, you could argue that it doesn't matter much whether the signal read off the CCD was first stored in analog form on tape or disc. It's the same signal, with only very slight degradation from the analog "middle man".
Moore's law was always an educated, informal guess, based on observations of ICs in '65 and revised in '75, so it rested on ECL, TTL and early MOS. And, very importantly, it is to a large degree self-fulfilling.
Various "exotics" (at the time and for decades later) follows different trajectories.
It had little to do with performance as such.
In fact, some of the highest-performance computers of the time had quite low integration.
What I found with Kodak is that they never put any effort into saying that film was in any way better than digital. I don't know that they ever attempted to convince consumers that they should keep using film - did they ever advertise in that way? I know that they did advertise their own digital cameras and printers a lot. All marketing from every direction in the early 2000s was "buy a digital camera". And, of course, when social media came along, that solidified a reason to do so.
Kodak remains true to form with their lack of interest in supporting their remaining customers (like the people here) by making more film and maybe a selection of paper. Well, they did rerelease Ektachrome - a slide film - with much fanfare - a film you can't enlarge. Obviously, it's so you can post Ektachrome scans to Instagram. That's feeding a fad - not supporting a practice.
I wondered the same.
Maybe they were caught in a case of put up or shut up?
They had no real way of guaranteeing film's superior quality to the customer.
Accessible and affordable scanners were woefully inadequate. They still are, for that matter.
Mr. & Ms. Mom had gotten used to the dreadful drugstore prints. What started out as glorified contact prints, there to show you what you really wanted printed, ended up being the final and only imaginable product from film for the vast majority of consumers.
Don't knock Ektachrome, though; it's a tremendously good film that perfectly complements Fujifilm's offerings.
Amazing latitude for a slide film, superb colours, and perhaps the sharpest colour film Kodak currently offers.
And you enlarge it by putting it in a projector.
Because of the low grain, the positive (reversal) image, and the fact that it isn't insanely dense like the Velvias, it's one of the most scannable films.
With a good scan you can “enlarge” as much as you want.