New lab scanner option: Auralab

OP
Jon Buffington

Subscriber
Joined
Jun 23, 2014
Messages
669
Location
Tennessee
Format
35mm
Yes I think so, all the electronics and PC software are finished and working. I’ve done the whole mechanical design in CAD; it's just a case of making it and seeing what’s wrong 😂
I am seriously interested in this! Please keep us updated. My ancient Pakon needs a new companion.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
20-25k for this is insanely expensive considering the manufacturing cost.

I would like to know if others would be interested in a product like this if it cost around 1k. I designed all the electronics and software for a scanner like this a few years ago. It's all working, but I haven't assembled the mechanical parts yet. I could finish this off and get it on Kickstarter without a lot of effort. What do you lot think?

My current setup for all the hardware is less than $10K, but well north of $5K, and I can average digitizing a roll of 36-exposure 35mm film in less than 5 minutes, with an additional ~15-20 minutes to post-process the roll, make any adjustments to fix the white balance, exposure, etc., spot out dust, and get deliverable files packaged up ready for upload. The realistic cost of the code I run to realize all this is about the same as the hardware cost if you remove the time spent with me just being stupid and figuring things out, so all in, I'd not be willing to spend more than ~$15K unless it was faster or required less of my attention.
 
Joined
Mar 3, 2011
Messages
1,507
Location
Maine!
Format
Medium Format
My current setup for all the hardware is less than $10K, but well north of $5K, and I can average digitizing a roll of 36-exposure 35mm film in less than 5 minutes, with an additional ~15-20 minutes to post-process the roll, make any adjustments to fix the white balance, exposure, etc., spot out dust, and get deliverable files packaged up ready for upload. The realistic cost of the code I run to realize all this is about the same as the hardware cost if you remove the time spent with me just being stupid and figuring things out, so all in, I'd not be willing to spend more than ~$15K unless it was faster or required less of my attention.

That amount of time is a complete non-starter for any lab. 25 minutes per roll of 36 exposures? You can do only 20 rolls a day??? Assuming you're working 10 hours???

One of the scanners in development I'm aware of does 7.5 frames per second at roughly 6000 pixels in the long dimension. That's a workflow. Hell, our old HS1800 can do a roll of 36 exposures, to 8x12" deliverable scans, in less than 2 minutes.
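For a sense of scale, here's a back-of-the-envelope sketch in Python of the throughput figures being compared; the per-roll times are the ones quoted in these posts, and the 10-hour working day is the assumption already stated above.

```python
# Rough throughput comparison using the figures quoted in this exchange.
def rolls_per_day(minutes_per_roll: float, hours_worked: float = 10.0) -> float:
    """How many 36-exposure rolls fit into a working day."""
    return hours_worked * 60 / minutes_per_roll

print(f"~25 min/roll (scan + post): {rolls_per_day(25):.0f} rolls/day")  # ~24
print(f"HS1800 at ~2 min/roll:      {rolls_per_day(2):.0f} rolls/day")  # ~300
# The 7.5 frames/second machine, counting scan time only:
print(f"7.5 fps scanner: {36 / 7.5:.1f} s of scanning per 36-exposure roll")  # 4.8 s
```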
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
That amount of time is a complete non-starter for any lab. 25 minutes per roll of 36 exposures? You can do only 20 rolls a day??? Assuming you're working 10 hours???

One of the scanners in development I'm aware of does 7.5 frames per second at roughly 6000 pixels in the long dimension. That's a workflow. Hell, our old HS1800 can do a roll of 36 exposures, to 8x12" deliverable scans, in less than 2 minutes.
It depends on the size of the lab, and whether that's all they do. Processing and scanning film isn't my only revenue stream. If that's all I did and I had enough film coming in, I'd be making huge investments into getting finished files way faster. A few years ago I had a lot of mail-order business; however, enough new labs have popped up that people no longer have to mail their film in, and a lot of that business has died down, at least in the SF Bay Area. Hell, I get a lot of business just from people who don't want to drive one town over to drop their film off at the lab they were using before they found mine, much less mail it out and pay for shipping. Either that, or a lot of people have started home developing, which, given how many home processing kits I sell, has been a booming business.

All that said, that is for my high-end scans, and that is the maximum time, including having finished files packaged up and ready for upload in the size and format the client wants. It's often way less time than that. Normal small scans aren't done with that setup; I use a Pakon scanner for those, which runs mostly unattended and produces finished files fairly quickly. While I do accept mail orders from pretty much anyone who wants to send film in, the bulk of my business of late has been locals, and given that there's another local lab less than an hour's drive in pretty much every direction from mine, I just don't have a need to process and scan hundreds of rolls a day, every day. Yes, there are busy times when I'm totally slammed and have quite a backlog, but normal day to day, it's just one of the several revenue streams I tend to.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
One of the scanners in development I'm aware of...
Really cool! I am not asking you to violate any NDAs, but since you mentioned the existence of several scanners in development, do you mind sharing a rough preview of what may be coming? The model you've described is optimized for volume, but have you heard of a modern take on the Flextight or even a Coolscan? Something like 10-15 minutes per roll and true 5,000 DPI for both 120 and 135?
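For scale, a quick sketch of what "true 5,000 DPI" implies in pixel counts, using nominal frame dimensions (the 135 and 6x7 sizes below are the usual nominal figures, an assumption on my part):

```python
# Pixel dimensions and megapixels implied by scanning at a given DPI.
MM_PER_INCH = 25.4

def scan_size(frame_w_mm, frame_h_mm, dpi):
    w = round(frame_w_mm / MM_PER_INCH * dpi)
    h = round(frame_h_mm / MM_PER_INCH * dpi)
    return w, h, w * h / 1e6  # width px, height px, megapixels

for name, (fw, fh) in {"135 (36x24mm)": (36, 24), "6x7 (70x56mm)": (70, 56)}.items():
    w, h, mp = scan_size(fw, fh, 5000)
    print(f"{name}: {w}x{h} px, ~{mp:.0f} MP at 5000 DPI")
# 135: 7087x4724 px, ~33 MP; 6x7: 13780x11024 px, ~152 MP
```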
 

gswdh

Member
Joined
Mar 12, 2022
Messages
56
Location
Europe
Format
35mm
Really cool! I am not asking you to violate any NDAs, but since you mentioned the existence of several scanners in development, do you mind sharing a rough preview of what may be coming? The model you've described is optimized for volume, but have you heard of a modern take on the Flextight or even a Coolscan? Something like 10-15 minutes per roll and true 5,000 DPI for both 120 and 135?

In all honesty, 5000 DPI is really overkill for nearly any film. You’re not going to get much use out of anything above 3000. Higher resolution means a smaller pixel size for any given imager size, which brings other issues like higher noise and slower readout times.
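To put numbers on the pixel-size point, a small sketch assuming a 1:1 optical magnification (so the sampling pitch on film equals the sensor's pixel pitch); the DPI values are just illustrative:

```python
# Pixel pitch implied by a given sampling resolution at the film plane.
MM_PER_INCH = 25.4

for dpi in (2000, 3000, 5000):
    pitch_um = MM_PER_INCH / dpi * 1000  # micrometres between samples
    print(f"{dpi} dpi -> {pitch_um:.1f} um pitch")
# 2000 dpi -> 12.7 um, 3000 dpi -> 8.5 um, 5000 dpi -> 5.1 um
# Smaller pixels gather less light each, hence the noise/readout trade-off.
```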
 

Anaxagore

Member
Joined
Jun 1, 2005
Messages
130
Format
Medium Format
Let's wait for the first prototype. As of now, all I see is a "non-contractual" 3D render and unverifiable promises.

There was a prototype demonstrated at the Paris photo show about a month and a half ago. It was not yet fully functional regarding the image processing, but the main functionality was there.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
In all honesty, 5000 DPI is really overkill for nearly any film. You’re not going to get much use out of anything above 3000. Higher resolution means a smaller pixel size for any given imager size, which brings other issues like higher noise and slower readout times.
Grain appearance is notably more realistic and pleasant at 5K DPI vs 3K, especially at higher magnifications and when developers like Rodinal or Ilfosol are used. In fact, I believe that low-resolution scanning produced the myth of Rodinal's "large" grain. People who wet print tend to have a better opinion of those developers.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
Grain appearance is notably more realistic and pleasant at 5K DPI vs 3K, especially at higher magnifications and when developers like Rodinal or Ilfosol are used. In fact, I believe that low-resolution scanning produced the myth of Rodinal's "large" grain. People who wet print tend to have a better opinion of those developers.

I’ve digitized *a lot* of film, and can offer a few observations based on reality…

1. there’s a reason why Kodak/Pakon/Nextlab never increased the resolution of their dedicated lab film scanners beyond 2000x3000 pixels and instead focused on making later versions simply faster. At typical display resolutions and viewing distances, higher resolution is very much diminishing returns.

2. it’s extremely easy to zoom in on a computer and declare more resolution superior or a nicer rendering of the grain. I have the equipment and technical expertise to digitize 35mm film at well over 5000 dpi, and yes, at that resolution, when you zoom in and inspect the grain at effective magnification levels higher than ever, the grain renders oh so nicely. It’s fantastic. Unfortunately, that’s not how I or any of my clients actually look at pictures. We look at *the whole picture*, and all that awesome detail is effectively lost as soon as you zoom out to view the whole picture on pretty much every display device available to us. This is reality.

3. for 35mm film, even 2000x3000 pixels has quite visible grain when you zoom in. A scanner that actually resolves a solid 2000+ dpi is going to render that grain pretty crisply. Higher dpi is very much diminishing returns when viewing the whole image on most displays at most viewing distances.

4. the vast majority of people who shoot film today don’t care about grain unless it’s particularly egregious looking, which rarely happens with currently available film. We here at Photrio are generally operating inside a bubble that isn’t really reflective of the current film consumer marketplace reality.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
@Adrian Bacon You are not going to convince me to stop trusting my eyes. I view my scans on a high-DPI "retina" 27" display. This is the norm in 2023. High-resolution scans of 35mm HP5+ negatives show up visibly crisper, and their grain visibly "tighter", than what common lab resolutions deliver. I am talking about simply hitting the full-screen button in Apple Photos. No need to zoom or pixel peep.

This is objective reality.

Your message of hitting diminishing returns beyond a certain good-enough point is well received in principle. But it did not land. 3,000dpi is simply not enough.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
@Adrian Bacon You are not going to convince me to stop trusting my eyes. I view my scans on a high-DPI "retina" 27" display. This is the norm in 2023. High-resolution scans of 35mm HP5+ negatives show up visibly crisper, and their grain visibly "tighter", than what common lab resolutions deliver. I am talking about simply hitting the full-screen button in Apple Photos. No need to zoom or pixel peep.

This is objective reality.

Your message of hitting diminishing returns beyond a certain good-enough point is well received in principle. But it did not land. 3,000dpi is simply not enough.

I expect you to do what works for you, and if that works for you, that's great.

I'll just leave you with a final thought. The human visual system is optimized to detect high-contrast edges. The higher the contrast of the edge, the more we perceive it as sharp, fine detail. Your visual system is very easily fooled and is therefore not objective, but rather quite subjective. Display manufacturers and software makers know this and optimize their pipelines to play into it. For example, many modern monitors have a sharpness setting that you can tweak. If the monitor itself weren't doing a bunch of processing to the image data it was being fed from the display card, there would be no need for that; yet there it is, because the monitor is doing whatever image processing the manufacturer thinks best delivers the most crisp and pleasing image to the viewer, regardless of what was done to the image before it ever left the video card. The reality is, unless you have a reference display that has been measured to deliver accurate results, what you see on a monitor is quite subjective.

You say 3000 dpi is simply not enough. Fair enough. Have you actually been able to compare scans where the resolution really was 3000 dpi? How do you know it was actually 3000 dpi, and more importantly, 3000 dpi with enough contrast that our visual system would actually register the detail? Total object contrast below 50% tends not to register with our visual system. This doesn't mean there's no detail; it means we don't see it without sharpening to bring that contrast up into our detectable range. Conversely, you'd be amazed at how often people think something is high resolution because it's really sharp, when in fact it doesn't have that much fine detail; it was simply sharpened by somebody who knows how to trick the human visual system into thinking it's looking at high resolution.
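For illustration, here is a minimal unsharp-mask sketch using Pillow, the kind of contrast boost being described; the file name and parameter values are placeholders, not recommendations:

```python
# Unsharp masking raises local edge contrast so fine detail registers to the
# eye; it adds no actual resolution. File name and settings are hypothetical.
from PIL import Image, ImageFilter

img = Image.open("scan_3000dpi.tif")
sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))
sharpened.save("scan_3000dpi_sharpened.tif")
```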

I don't know anything about how you're getting your high-resolution scans, but I'd ask: how do you know your scans actually exceed 3000 dpi? I've had more than one person send me film as a test and ask why my supposedly lower-resolution scans appeared to have more detail and be sharper than their own higher-resolution scans of the same frames. All I could say was: perhaps you weren't capturing as much detail as you thought, so even though it was the highest resolution you've achieved to date, your actual resolution was a really solid ~3000 dpi. Just saying. Resolution, fine detail, and sharpness are objective if measured with instruments, but not so much with our eyes. It's shockingly easy to fool our eyes.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
Adrian, let me save you some typing. I have been working with digital imaging chains (in lithography) for most of my career and am quite familiar with how everything works. I also happen to have negatives that have been scanned on a variety of equipment, from cheap Epson flatbeds to Coolscans, Flextights, and Creos, with known true DPI characteristics measured with USAF targets. I have also built several camera-scanning rigs using sensors ranging from 24MP to 100MP, with pixel shift and without.

And when I say that scanning at 3,000 DPI produces distorted grain appearance (which gets exaggerated by various digital "enhancements" at various stages of the imaging chain you're referring to), I mean exactly that. HP5+ scanned at 3K DPI will look like coarse shit in almost all apps on all monitors. But if you have a high-DPI display and a good renderer like Apple Photos, a high-resolution scan will look absolutely gorgeous, just as good as a wet print of the same size at a reasonable distance. Besides, one can always examine their "grainy" HP5+ negative with a high-magnification loupe to realize that something is wrong with their scan.

Some refer to this issue as grain aliasing, but I have never seen a comprehensive definition of the term. The basic idea is not hard to visualize, though: when your resolution is below a certain level, smaller-than-threshold grain particles get lumped together into weird-looking clumps with unnatural patterns, and god help you if you apply sharpening and excessive JPEG compression on top.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
Adrian, let me save you some typing. I have been working with digital imaging chains (in lithography) for most of my career and am quite familiar with how everything works. I also happen to have negatives that have been scanned on a variety of equipment, from cheap Epson flatbeds to Coolscans, Flextights, and Creos, with known true DPI characteristics measured with USAF targets. I have also built several camera-scanning rigs using sensors ranging from 24MP to 100MP, with pixel shift and without.

And when I say that scanning at 3,000 DPI produces distorted grain appearance (which gets exaggerated by various digital "enhancements" at various stages of the imaging chain you're referring to), I mean exactly that. HP5+ scanned at 3K DPI will look like coarse shit in almost all apps on all monitors. But if you have a high-DPI display and a good renderer like Apple Photos, a high-resolution scan will look absolutely gorgeous, just as good as a wet print of the same size at a reasonable distance. Besides, one can always examine their "grainy" HP5+ negative with a high-magnification loupe to realize that something is wrong with their scan.

Some refer to this issue as grain aliasing, but I have never seen a comprehensive definition of the term. The basic idea is not hard to visualize, though: when your resolution is below a certain level, smaller-than-threshold grain particles get lumped together into weird-looking clumps with unnatural patterns, and god help you if you apply sharpening and excessive JPEG compression on top.

I will respectfully disagree with your assertion. If there's one thing I've learned after all these years, it's that what we think we know isn't always what actually is until it's been tested with a fair amount of rigor, with our own bias left at the door, and with a readiness to learn something we didn't know before, because oftentimes that's exactly what happens. I have no doubt that you are experienced in what you know, and you clearly think you know more than I do about these things, which I'm totally fine with. You very well may; I can only take your word for it at this point, because I don't know who you are and don't have any interaction with what you know outside of this thread. I don't have anything to prove here, and couldn't care less whether you know who I am, or know about any of the information or actual testing results I've shared here on Photrio over the years. All I can do is what I've done in the past: do some actual testing, post the results, and let others come to their own conclusions after looking at the data.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
I did not mean to imply that I am more knowledgeable than you, @Adrian Bacon. I apologize if you feel that way. I just did not feel like going off on tangents about how we perceive detail or how monitors and sensors work. Instead I have been zooming in on a very specific use case: the grain appearance of 35mm ISO 400 scans when viewed full-screen on "retina" displays continues to benefit from scanning resolution increases beyond 3000 DPI. We can agree to disagree on how significant those improvements are, but I did not want to expand the scope into fine image detail, printing, low-res monitors, mobile, etc.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
I did not mean to imply that I am more knowledgeable than you, @Adrian Bacon. I apologize if you feel that way. I just did not feel like going off on tangents about how we perceive detail or how monitors and sensors work. Instead I have been zooming in on a very specific use case: the grain appearance of 35mm ISO 400 scans when viewed full-screen on "retina" displays continues to benefit from scanning resolution increases beyond 3000 DPI. We can agree to disagree on how significant those improvements are, but I did not want to expand the scope into fine image detail, printing, low-res monitors, mobile, etc.
That's fine; however, your experience is specific to you. The vast majority of my clients don't have your experience, and I'm sure my client base isn't unique. They experience scans on their phones. Many of them don't even have computers (shocking, I know, but that's how it is with many of the younger generations: they use phones or tablets or both, nary an actual computer in sight; I can't imagine getting through a day with nothing but a phone or tablet, but they seem to manage with few if any problems). So you have a fairly high resolution display to look at scans on (as do I), but that's not really reflective of the broader film user base.

For my own personal work, I absolutely favor higher resolution over lower resolution, but I'm also aware that it's extremely easy to get caught up in the bigger-is-better mindset, and even easier to think that something is perceptually better because it's better on technical specs alone, ignoring or discounting anything that doesn't agree with that view. That's a shockingly easy hill to choose to die on, but based on my own experience, not everything is as it appears in the specs, and it's often worth taking a step back and asking, "is that *really* the case, or do I have some confirmation bias going on?". This is why I've done a fair amount of testing where I throw out all assumptions about what is acceptable, to try to suss out where the point of diminishing returns really is and what is sufficiently sharp and detailed for *most use cases*, and why I'm quick to caution others not to be so fast to discount anything but high-resolution scans as acceptable. They might not be acceptable to you for various reasons, but that doesn't mean they're not acceptable for anybody else, and that's OK.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
a very specific use case: the grain appearance of 35mm ISO 400 scans when viewed full-screen on "retina" displays continues to benefit from scanning resolution increases beyond 3000 DPI.

I'm breaking this out into a separate comment because I think it bears a little exploration, and it's the main reason I don't agree that more resolution necessarily nets better rendering of the grain on high-resolution displays. In a previous post, you referenced what I'm assuming to be Apple's 27-inch Retina display, which has 5120x2880 pixels if I'm not mistaken. This means any image displayed full screen gets scaled to no more than 5120 pixels wide or 2880 pixels tall, so a full 36x24mm frame of film displayed full screen will either have black bars on the sides or have a bit of the top and bottom cropped off, whatever your preference. With black bars on the sides, you're seeing the 24x36mm film frame scaled to 2880x4320 pixels, or effectively 3048 dpi. If you crop a bit off the top and bottom so that the 36mm frame width spans the 5120 pixels, that number increases slightly to ~3600 dpi.
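The arithmetic above, spelled out as a small Python sketch:

```python
# Effective DPI of a 24x36mm frame shown full screen on a 5120x2880 display.
MM_PER_INCH = 25.4
DISPLAY_W, DISPLAY_H = 5120, 2880
FRAME_W_MM, FRAME_H_MM = 36, 24

# Fit to height (black bars at the sides): 24mm of film spans 2880 pixels.
fit_height_dpi = DISPLAY_H / (FRAME_H_MM / MM_PER_INCH)
# Fit to width (top and bottom cropped): 36mm of film spans 5120 pixels.
fit_width_dpi = DISPLAY_W / (FRAME_W_MM / MM_PER_INCH)

print(f"fit to height: {fit_height_dpi:.0f} dpi")  # ~3048 dpi
print(f"fit to width:  {fit_width_dpi:.0f} dpi")   # ~3613 dpi
```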

So, if I had a scan of a film frame at 3000 dpi and one at 4000 dpi and cycled between the two on that display, how much of a difference do you think you'd see? If you have more than 3000-3600 dpi of information (4000 dpi, 5000 dpi, any value really), it gets wiped away when the image is scaled down to the display resolution. The grain may well appear sharper and better formed at the display resolution, but that's because your total object contrast got boosted to nearly 100% across the entire frame for every single line pair by being scaled down from a higher resolution. If my 3000 dpi scan actually resolved 3000 dpi, I could get nearly the same effect, if not the same effect, by boosting the total object contrast through appropriate sharpening. It's totally a perceptual thing. There isn't actually more information at the display resolution; it's just more visible to us because of how our visual system works.

At the end of the day, I'm still only seeing 3000-3600 dpi worth of information on that display when looking at a full-screen 24x36mm film frame. Adding more resolution to the scan does nothing but boost the total object contrast at the display resolution, and once that's maxed out everywhere, more resolution does nothing at all. I'd even offer that a really well executed 2000 dpi scan with appropriate sharpening would look pretty good on that display. This is why I say more resolution is diminishing returns.

If you're saying a 3000 dpi scan looks like garbage, then I'd say there's something wrong with how it was scanned and you don't actually have 3000dpi.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
@Adrian Bacon If you want to discuss it on this level, fine. It is helpful to separate two steps here: digitization and display. Both affect grain appearance.

Putting monitors aside for a moment, the image file data will have distorted grain due to the phenomenon frequently called grain aliasing; generally, all A/D converters introduce this distortion, subject to sampling frequency, AKA resolution. De-Bayering artifacts also contribute to it.

Then you have the issue of reproduction. The dimensions of an image never match the physical dimensions of a monitor. Software image renderers have gotten pretty sophisticated at down-sampling bitmaps to RGB pixel grids, and they generally perform better when they have extra pixels to work with. Apple Photos is my favorite; the same image looks way better in it than in a browser or in Windows Photos on the same monitor.

So here we have it: a fairly complex interaction between digitization artifacts and artifacts produced during image rendering. I find it fairly easy to see that higher-resolution scans render more naturally than lower-res ones, again without pixel peeping. I have no doubt that most people don't care, but aren't we supposed to be enthusiasts here? :smile:
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
Putting monitors aside for a moment, the image file data will have distorted grain due to the phenomenon frequently called grain aliasing; generally, all A/D converters introduce this distortion, subject to sampling frequency, AKA resolution. De-Bayering artifacts also contribute to it.

I hear this a lot, but I have only seen real aliasing with very specific scanner and software combinations. It can happen in very specific circumstances; however, the entire point of an optical anti-aliasing filter is to stop it (and they are very effective at doing so), and most optical systems have some form of one in the imaging pipeline for exactly that reason. There are a few very specific cases where it happens with some scanners and software, but true aliasing of the grain (or of any fine detail, because that's what grain effectively is) is a sign of a very poor optical design, and such a design shouldn't be used.

Any kind of aliasing of random fine detail is pretty hard to produce on most camera systems. With a Bayer array, the act of debayering down to RGB actually erases lots of random detail, and amplifies certain artifacts that can occur with regular patterns close to the frequency of the anti-aliasing filter's bandpass. Film grain doesn't usually follow what I'd call regular patterns, so this isn't something that regularly pops up when using a digital camera to scan film.

Also, why you'd debayer the sensor data when scanning black and white film is beyond me. The film has no color information, so after you white balance the raw sensel values for the color of the light you're using, just treat the whole thing as a monochrome image. It's amazing how much fine detail gets lost by interpolating all that out to get to RGB values from black and white film. It goes back to the whole not-having-as-much-resolution-as-you-think-you-have thing. Unfortunately, unless you write your own raw processing software that only white balances the black and white film samples and then treats them as monochrome, pretty much every other raw processor will debayer your sensor samples and smear a whole pile of fine image detail into a blurry mess.
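A minimal numpy sketch of the white-balance-then-monochrome treatment described above, assuming an RGGB mosaic; the gains are illustrative placeholders, not calibrated values:

```python
# White-balance raw Bayer sensel values and keep them as one grayscale plane
# instead of demosaicing to RGB (sensible for B&W film only).
import numpy as np

def bayer_to_mono(raw, r_gain, g_gain, b_gain):
    """Apply per-channel gains to an RGGB mosaic; no interpolation performed."""
    mono = raw.astype(np.float64)
    mono[0::2, 0::2] *= r_gain  # R sensels
    mono[0::2, 1::2] *= g_gain  # G sensels, even rows
    mono[1::2, 0::2] *= g_gain  # G sensels, odd rows
    mono[1::2, 1::2] *= b_gain  # B sensels
    return mono

# Usage with stand-in data and example gains for a warm light source:
raw = np.random.randint(0, 4096, size=(24, 36)).astype(np.uint16)
mono = bayer_to_mono(raw, r_gain=2.0, g_gain=1.0, b_gain=1.5)
```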

If you're going to chalk what you're describing up to grain aliasing, then maybe we should define exactly what aliasing is: aliasing occurs when you sample data at a spatial frequency below twice the highest spatial frequency present in the data (the Nyquist criterion). If your optical system had no anti-aliasing filter, this would be shockingly easy to do, and you'd see it popping up all over the place with all camera systems. Fortunately, that's not the case, because most if not all imaging systems have an anti-aliasing filter in front of the sensor, so the sensor never sees spatial frequencies greater than it can sample. Combine that with the fine-detail-destroying Bayer demosaicing that generally goes on, and you simply have less resolution.
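A small numerical illustration of that definition: decimating grain-like noise with no low-pass prefilter keeps the folded-back (aliased) energy, while band-limiting first, which is the anti-aliasing filter's job, removes it. The Gaussian blur stands in for a real optical filter:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
fine = rng.normal(size=(512, 512))  # stand-in for grain-like fine detail
factor = 4                          # sample at 1/4 of the original rate

naive = fine[::factor, ::factor]    # no prefilter: the spectrum folds back
prefiltered = gaussian_filter(fine, sigma=factor)[::factor, ::factor]

print("std after naive decimation:", round(float(naive.std()), 3))        # ~1.0
print("std after prefiltering:    ", round(float(prefiltered.std()), 3))  # much lower
# The naive version retains the full noise power as false coarse structure;
# the prefiltered version discarded frequencies it could not represent.
```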

So, if what you're describing isn't grain aliasing (it's not, unless you're doing something horribly wrong), then what is it? It's simply less resolution, and as a result, blurrier grain when scaled up for viewing on higher-resolution displays.

Software image renderers have gotten pretty sophisticated at down-sampling bitmaps to RGB pixel grids, and they generally perform better when they have extra pixels to work with

Actually, there's only a handful of image rescaling algorithms, and they have been in use for a very long time. I've implemented most of them in my own code. The differences you're seeing are just different algorithms being used. I don't know which one Apple uses, but I'd imagine it's in the Centripetal Catmull-Rom family, as those generally provide the best all-around performance while being relatively slow compared to other algorithms. I don't know what Microsoft uses either, but if it looks worse than Apple's, it's probably not the same algorithm. I won't disagree that they perform better with more pixels to work with, but the "better" comes in the form of boosting the total object contrast of the resulting detail, not in putting more detail into the final image, because that is impossible to do.
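For reference, a sketch of the Catmull-Rom cubic convolution kernel (the a = -0.5 member of the standard cubic family) with a toy 1-D resampler; this illustrates the algorithm family named above and is not Apple's actual implementation:

```python
# Catmull-Rom cubic convolution weight for a sample at distance x.
def catmull_rom(x, a=-0.5):
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
    return 0.0

def resample_1d(samples, t):
    """Interpolate at fractional position t from the 4 nearest samples."""
    i = int(t)
    return sum(
        catmull_rom(t - (i + k)) * samples[max(0, min(len(samples) - 1, i + k))]
        for k in range(-1, 3)
    )

print(resample_1d([0.0, 1.0, 1.0, 0.0], 1.5))  # 1.125: mild overshoot from the
# kernel's negative lobes, which is part of what makes edges look crisper.
```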
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
@Adrian Bacon and your point is? I too never liked the term "grain aliasing," but that's what's most frequently used to describe the subpar grain appearance of low-res scanning. Grain gets blurrier and adjacent particles get merged into larger ones, the result is artificially sharpened, and then downsampled to fit a target screen's RGB grid. And the final result appears coarser regardless of the physical image area on the monitor. I can't quite understand what exactly you're disagreeing with?
  • Some users don't care. Well, sure.
  • My scans are not exactly 3000dpi. But as I said, I've used numerous pieces of equipment (and scanning services) in the past, with well-known specs. I also mounted the same lens on progressively higher-resolution sensors in the 24-100MP range and, all other variables being the same, grain appearance continued to improve with resolution.
Anything else I'm missing? You sound as if I'm criticizing your services, but I have never seen your scans (if 3000dpi is your target then it's purely a coincidence), and maybe I'm just not your target audience? My original post, which asked for a 5,000 DPI device, specifically referred to a low-volume enthusiast-grade scanner like a modern X1. Apparently a scanner with such specs was a good idea back then, and I am a bit lost as to why you think it's not a good idea now.
 

Adrian Bacon

Member
Joined
Oct 18, 2016
Messages
2,086
Location
Petaluma, CA.
Format
Multi Format
your point is?
My point is that I wish people would stop using the term grain aliasing and instead just articulate that they don't like the way the grain looks with a given setup. 99.999% of the time it's either simply less resolution, or the effect of some other artifact being introduced somewhere in the image processing pipeline after the acquisition stage. Grain aliasing is largely an internet myth propagated by people who say "it's grain aliasing" whenever they see grain they don't like. No, it isn't. Real grain aliasing is extremely ugly, and you'll know it when you see it. If there's something somebody doesn't like about the grain they're seeing, instead of just calling it grain aliasing, they should actually try to figure out why it looks that way. Maybe the answer actually is more resolution, or maybe it's using a different raw image processor (or a different demosaicing algorithm) to see if you get different results, or maybe it's not blindly sharpening the daylights out of the image like many people do. But it's very unlikely to be actual grain aliasing, especially if you're using anything but the few known scanner and software combinations that actually produce aliased output, and those haven't been made or widely available for a long time now, which is a good thing. Now if we can just get people educated enough to know it's unlikely to be that, but rather something else.

You sound as if I'm criticizing your services, but I have never seen your scans (if 3000dpi is your target then it's purely a coincidence), and maybe I'm just not your target audience?

Almost nobody here on Photrio is in my target demographic, and that's OK. Over 80% of my clients ask for scans that are 1000x1500 or 1500x2250 pixels because they don't want big files, and that's more than enough resolution for how they view their images, which is either on their phone or on their TV. For most, I typically scan at 2000x3000 and scale down to the target resolution, because I want the final output to be as sharp as possible, and I'm still dealing with fairly small file sizes, so it doesn't cost me much time or storage to do that. I can deliver quite a lot more resolution than that, but very few people want it or are willing to pay for it, except for that one special roll here or there.
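The scan-big-then-scale-down step described above is a one-liner in most imaging libraries. A sketch with Pillow, with hypothetical file names and Lanczos chosen as one reasonable downscaling filter:

```python
from PIL import Image

scan = Image.open("frame05_2000x3000.tif")  # hypothetical full-resolution scan
small = scan.resize((1000, 1500), Image.Resampling.LANCZOS)  # delivery size
small.save("frame05_delivery.jpg", quality=90)
```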

I can't quite understand what exactly you're disagreeing with?
We got sidetracked down into the weeds a bit. The original disagreement was over whether more resolution is diminishing returns. I've already stated my case for why it is, so I won't re-litigate it here. If that doesn't work for you, that's OK; after all, we're a couple of guys on a pretty hardcore enthusiast forum discussing things that most people don't understand, don't care to understand, and largely don't care about.

My original post, which asked for a 5,000 DPI device, specifically referred to a low-volume enthusiast-grade scanner like a modern X1. Apparently a scanner with such specs was a good idea back then, and I am a bit lost as to why you think it's not a good idea now.

I don't think it's a good idea now because I highly doubt it'd cost less than simply getting a nice macro lens, a 40+MP camera back, and a decent copy stand, light, and film holder setup. The hardware isn't really the challenge. It's the software. It's always been the software. Even the Auralab that originated this thread is apparently still trying to work out the software, and it's been what, at least 6 months now? It's just not that easy to do in a cost-efficient manner.

The better course would be to provide a general-purpose software solution that works with a wide range of input sources, and maybe put together one or two verified hardware packages or hardware lists of commonly available off-the-shelf hardware for those who don't want to faff about, so they can either buy the kit as a turnkey, or buy the parts on the list, then buy the software license. You'll burn about the same amount of money getting the code to work, but you don't have the cost of developing your own hardware, or the long-term expense of supporting it. Once everybody who is going to buy one has bought one, you'll have no more money coming in and will still be expected to provide support. The only way to get more revenue is to bear the expense of developing new hardware and getting your customers to buy it again, which, if your hardware was any good, they'd be unlikely to do unless they just had to have the highest resolution and were willing to pay for it. You could also generate new software features they could buy, but again, if the hardware never changed, you'd pretty quickly run out of new software features and end up changing the software for nothing more than the sake of change so you can sell a new version. We all know how much users hate that. You could also generate ongoing revenue by charging for a yearly support contract. That works for businesses selling to businesses, but how many consumers really go for that?

No, the best long-term course of action is to make software that works with a wide range of commonly available inputs, with a built-in "support contract" where users get free minor releases and patches but pay for major version upgrades. Start with a first version that is functional but not the most full-featured thing out there, with modest support for the most commonly available cameras, then over time determine what goes into a minor version and what goes into a major version based on how much it costs to develop and how many people would pay for it. You'll still eventually hit the point where everybody who is going to buy it has bought it, and you'll either need to start selling a monthly/yearly subscription or start changing the software for the sake of change so you can charge for new versions. That point would be 5-10 years down the road from the first release, but it would be hit nonetheless, once the code does everything most users want and everybody willing to pay has done so. The difference is that you didn't bear the expense of designing your own hardware or of supporting it.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
Now we're talking!

maybe put together one or two verified hardware packages or hardware lists of commonly available off-the-shelf hardware

This is kind of what I had in mind. I chatted with Andre and Nate, the creators of NLP and Negmaster, and the absence of standardized hardware is a major PITA for developing color inversion software. IIRC you also posted somewhere here that the software you developed only works well with your camera and your light source, not with others.

The "scanner" I'm dreaming of would consist of something like a film toaster combined with a process lens with auto-focus and a 100MP area sensor without a color filter array, doing 3 separate RGB exposures in quick succession using optimized light. Bonus points for a built-in motorized stage for digitizing large format with multi-exposure stitching like in Creos. The output should be just a raw DNG with a known color profile.

This would provide a common hardware platform for all of you guys to build your software on. It is extremely rare for any company to be equally good at software and hardware at the same time. If even Porsche cannot build a website and a mobile app that work, what can you possibly expect from small and independent projects? Decent mid-career software engineers are banking north of $200K even outside of California these days, and that caliber of talent will never work for Auralab. They should stick to hardware only and leave the software to people like you or LaserSoft.

No, the best long-term course of action is to make software that works with a wide range of commonly available inputs
Well, sure. Except everyone is failing to do that. Every color inversion software package I've tried produced default results not as good as the Nikon software supplied with Coolscans. In a separate thread I remember you posting how much extra work (years) it would take to convert your software to accept commonly available inputs. Besides, nobody actually makes commonly available inputs dedicated to film scanning. Instead, users are suffering with old and antiquated scanners or re-purposing digital cameras for the task. My camera scanning rig was close to $7K, and using it still sucks, because every single component except the Negative Supply holders is not being used as originally intended. And there's literally nothing I can spend more money on to improve my scanning experience. I could drop another $10K for a tiny marginal improvement, but on the process/experience side I'll continue to suffer with a horrific contraption anyway.

My point is that this is a problem worth solving, and you and your software would benefit from it greatly. Market size is the real question, and I hope it's growing.
 

Steven Lee

Member
Joined
Jul 10, 2022
Messages
1,398
Location
USA
Format
Medium Format
Creating a separate comment regarding another tangent you mentioned: business models. You raised good points there, the major one being the low LCV of a one-time hardware purchase. But the elephant in the room is market size. A big enough market cures everything.

Let's say there are 1M people who're willing to spend $5K on a film scanner that offers a meaningful improvement over their camera rig. Let's assume the scanner lasts 10 years on average and your market share is 25%. That means 25,000 units sold per year. If you take $3K from each sale, that leaves you about $1K per unit at ~65% gross margin, i.e. $25M/year for OPEX and profit. I may be off here and there for the sake of simpler math, but these numbers are not bad... assuming there are indeed one million people who'll want it.
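Spelled out as a sketch, using the post's own (admittedly optimistic) assumptions:

```python
# Unit economics under the assumptions stated above.
market_size = 1_000_000    # people willing to spend $5K on such a scanner
market_share = 0.25
lifetime_years = 10
profit_per_unit = 1_000    # $ left per unit, per the estimate above

units_per_year = market_size * market_share / lifetime_years
annual_profit = units_per_year * profit_per_unit
print(f"{units_per_year:,.0f} units/year -> ${annual_profit / 1e6:.0f}M/year "
      "for OPEX and profit")
# 25,000 units/year -> $25M/year
```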
 

gswdh

Member
Joined
Mar 12, 2022
Messages
56
Location
Europe
Format
35mm
Creating a separate comment regarding another tangent you mentioned: business models. You raised good points there, the major one being the low LCV of a one-time hardware purchase. But the elephant in the room is market size. A big enough market cures everything.

Let's say there are 1M people who're willing to spend $5K on a film scanner that offers a meaningful improvement over their camera rig. Let's assume the scanner lasts 10 years on average and your market share is 25%. That means 25,000 units sold per year. If you take $3K from each sale, that leaves you about $1K per unit at ~65% gross margin, i.e. $25M/year for OPEX and profit. I may be off here and there for the sake of simpler math, but these numbers are not bad... assuming there are indeed one million people who'll want it.

I think 1M customers is extremely optimistic, but I don't have any evidence suggesting what the number should be.

For the majority of the time software has been sold, it was sold as a one-time purchase, without the ability to continually update it and break it. I don't see the software as being a challenge. I understand the issues with the current software on the market; however, those problems are easy to solve. Colour calibration can be easily achieved with reference films.

With the current market as it is, I think the best way would be to have one hardware and one software pairing. Designing software to work with many different systems complicates it and causes problems. Like I say, I've already created simple, working software for my film scanner, and I did it in about a week's worth of evenings. It could do with some more features, but they're not too difficult to implement. The market really isn't big enough for many scanners; perhaps a fast 135 scanner and a Flextight-like high-resolution, multi-format scanner.
 