Yeah, QTR actually doesn't care what kind of units it's fed. But since you're apparently applying the curve in PS or GIMP anyway, you might as well do the whole thing in Excel. The QTR tool seems nice, but it's a bit of a black box and I've never had much luck with it; it tends to complain about the data even when there's no clear reason for it. I tend to steer clear of it.
I guess I’ll just have to scan the wedge and use one of the tools that work off the scanned image (or just measure the densities in PS?)
L-value of original digital step wedge | L-value measured on printed sample | Normalized, inverted L-value (0–100) |
100 | 34.77 | 0.0 |
95.8 | 35.72 | 1.6 |
91.3 | 36.18 | 2.4 |
86.7 | 36.99 | 3.8 |
82 | 36.41 | 2.8 |
77.7 | 35.91 | 1.9 |
72.9 | 37.46 | 4.5 |
68.1 | 35.78 | 1.7 |
63.2 | 38.25 | 5.9 |
58.6 | 39.07 | 7.3 |
53.6 | 39.68 | 8.3 |
48.4 | 42.43 | 12.9 |
43.2 | 44.84 | 17.0 |
37.8 | 48.25 | 22.8 |
32.7 | 52.09 | 29.3 |
27.1 | 57.22 | 37.9 |
21.2 | 65.26 | 51.5 |
15.2 | 75.3 | 68.5 |
9.3 | 84.46 | 84.0 |
3.6 | 91.97 | 96.7 |
0 | 93.94 | 100.0 |
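For reference, the last column appears to be a simple min-max rescaling of the measured values, and it can be reproduced in a couple of lines. A sketch of the arithmetic, using the measured values from the table:

```python
# Min-max rescale the measured L-values so the darkest patch maps to
# 0.0 and the lightest to 100.0 (inverted relative to the original
# wedge order).
measured = [34.77, 35.72, 36.18, 36.99, 36.41, 35.91, 37.46, 35.78,
            38.25, 39.07, 39.68, 42.43, 44.84, 48.25, 52.09, 57.22,
            65.26, 75.30, 84.46, 91.97, 93.94]

lo, hi = min(measured), max(measured)
normalized = [round(100 * (L - lo) / (hi - lo), 1) for L in measured]
print(normalized)   # [0.0, 1.6, 2.4, 3.8, 2.8, ..., 96.7, 100.0]
```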
Yes, I configured the RPi 4 for that resolution.
The display board is basically "just" an FPGA; I discussed this with the SUMOPAI guy. I was of course thinking myself that this could be a fun FPGA project, but since the RPi works so well I don't see the point. Maybe I'll waste my time on printing and calibrating the last bits instead.
The "problem" with this solution is that the prints look absolutely perfect. There is no sign of an analog process; if you don't pick up the paper, you cannot tell whether it is a quality inkjet print or a silver gelatin wet print. The only benefit I can see in doing traditional B&W prints is that they last longer than inkjets.
A quick update: I have been working on the software end of things and have made many improvements. I changed the conversion algorithm and it is 100x faster. It can now create images that, when displayed in sequence, allow for 10- and 12-bit (~4000) levels of grayscale. The exposure module steps through them automatically. The software works with a USB power strip to control the enlarger light source. The monitor is turned off, so you initiate exposure with the space bar.
I have now found a PC for this and will order the screen next week, but I am a bit confused by some things...
The program will use a tif to generate the exposure frames, but will it convert the tif to the correct size automatically, or do I need to use the separate converter program?
The USB power strip you mention, could you point to the one you have used? I am unsure if I am able to switch the AC line on/off with the ones I see here for sale.
Thanks for helping!
The USB power strip is described here: https://www.cnx-software.com/2013/01/06/powerusb-computer-controlled-power-strips-review/ . Unfortunately it looks like the site is down for that product. You can also use your regular timer for the light on your enlarger: just put the exposure in loop mode in the software with the approximate exposure time. I can't guarantee the software will work with any other power strip.
The converter software is standalone; you don't use it with the exposure software. The exposure software takes care of everything. You just need to make sure the input is a 16-bit monochrome TIFF and that it fits on the LCD panel (7680x4320).
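The two requirements (16-bit monochrome, at most 7680x4320) can be checked without any imaging library. Below is my own minimal sketch, not part of the exposure software: it reads the TIFF header directly and only handles a single IFD with inline tag values, which covers typical stripped grayscale TIFFs.

```python
import struct

def tiff_info(data):
    """Read width, height and bits-per-sample from raw TIFF bytes."""
    endian = "<" if data[:2] == b"II" else ">"
    ifd_offset, = struct.unpack_from(endian + "I", data, 4)
    n_entries, = struct.unpack_from(endian + "H", data, ifd_offset)
    tags = {}
    for i in range(n_entries):
        off = ifd_offset + 2 + 12 * i
        tag, typ, _count = struct.unpack_from(endian + "HHI", data, off)
        if typ == 3:        # SHORT, value stored inline
            value, = struct.unpack_from(endian + "H", data, off + 8)
        elif typ == 4:      # LONG, value stored inline
            value, = struct.unpack_from(endian + "I", data, off + 8)
        else:
            continue
        tags[tag] = value
    # 256 = ImageWidth, 257 = ImageLength, 258 = BitsPerSample
    return tags.get(256), tags.get(257), tags.get(258)

def fits_panel(data, panel=(7680, 4320)):
    """True if the TIFF is 16-bit and no larger than the panel."""
    w, h, bits = tiff_info(data)
    return bits == 16 and w <= panel[0] and h <= panel[1]
```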
I guess one of these would be an even easier way to integrate the lamp control than a microcontroller....
pavel-a/usb-relay-hid — software for USB-connected relays with an HID interface (see doc/Readme_USB-Relay-DLL.md on GitHub; the wiki has more info).
And here a library in C#
Neumeier1961/USBrelayDevice_NET — provides control of HID-compliant USB relay devices using .NET (GitHub).
I have just set mine to keep the image displayed for 6000 seconds, and I use my regular enlarger timer to control exposure times. The enlarger timer does take an input from a foot switch, so I guess I could rig something to control the timer via that input - but I find it easier at the moment to just hit expose on my touchscreen, cover the screen, and from there work as though I have a normal negative in the holder.
That is what the loop checkbox is for: it will cycle through the images continuously. You should set the sequence time to approximately your print exposure time, unless you are using 8-bit, in which case it doesn't matter.
Yeah, I have it on loop - but I wasn’t paying too much attention to the loop time - my exposures are up around 120s, and I think the time was set to maybe 10? I assumed less than the exposure time would work ok?
Another tinkerer has joined the party. I bought an 8K 10.1” display, a Dell 7050 SFF PC linked earlier in this thread, a power supply upgrade and an Nvidia 3050 graphics card. I have a custom screen resolution set from the screenshot linked earlier.
I am ready to plug it in and try it, but I am unsure how to set the resolution for the first time. I would like to avoid destroying the screen by driving it incorrectly, like @avandesande did the first time. I am guessing the HDMI adapter board was designed to support a limited set of configurations and does not refuse unsupported configurations safely the way common consumer electronics do. The monitor that came with the PC is too low-spec to display the high custom resolution, so I can't set it first and then switch to the LCD (perhaps I could if I learn how to work blind).
I am leaning towards this:
-attach screen to adapter
-plug adapter in to usb c power
-plug adapter hdmi into second hdmi port
-use the nvidia display settings to configure the second display from the first monitor.
Can anyone on this thread provide tested steps?
Do the above steps look reasonable?
@avandesande, what did you do to destroy your first screen? Did you just plug it into the Intel integrated graphics port?
I ended up using the manual entry option on this tool https://thewetprint.com/curve/ to make a second curve to apply on top of my eyeballed one, which seems to have worked perfectly.
Wish I’d found this thread a couple months ago! It would have definitely saved me some time.
I’ve just finished building a digital negative carrier using the 10.1” 16K LCD (I like big prints…). Similar to what it sounds like you’ve done, I wrote a program which turns a 16-bit grayscale TIFF into a series of 512 3-bit frames to achieve 12-bit tonal depth. The enlarger I’ll be using it with is in storage, so I need to get that out and set up before I can start printing, but the digital stuff is done.
Have any of you successfully run your displays off of a Raspberry Pi? I’m told it should be possible, but I haven’t been able to get Linux to talk to the display properly.
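The post above doesn't spell out how the 16-bit-to-3-bit frame conversion works, but one plausible scheme (my assumption, not necessarily what either program actually does) is temporal dithering over the frame sequence:

```python
# Assumed scheme, not taken from the actual software: scale the 16-bit
# value onto the total light sum attainable over the sequence
# (512 frames x 7, since each frame pixel is 3-bit), then show base+1
# on `extra` frames and `base` on the rest. The time-averaged exposure
# then has 512*7 + 1 = 3585 distinct levels, roughly the "~4000
# levels" mentioned earlier in the thread.
def frame_values(v16, n_frames=512):
    """Return the 3-bit value (0..7) a pixel takes in each frame."""
    total = v16 * 7 * n_frames // 65535   # target sum over all frames
    base, extra = divmod(total, n_frames)
    # A real implementation would interleave the +1 frames through the
    # sequence to avoid visible stepping; placing them first keeps the
    # sketch short.
    return [base + 1] * extra + [base] * (n_frames - extra)
```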
I've been making slow progress on this project. Everything has gone smoothly so far.
I managed to get the video driver configured, which took a few days because the one Sumaopai suggested was not quite right (I will post mine for reference).
I made a frame for the lcd screen out of layers of mattboard.
I have made a few experimental test prints with and without filters, and have a few using @avandesande's lut.
It looks like I will have problems getting enough light: printing a 1/4 5x7 at f/8 requires 16s.
Don't have prints looking right yet. First tests are on Ilford MG WT FB semi-matt.
My next steps:
- Scans of a no-filter print of @avandesande's step wedge with good blacks.
- Set up a GitHub project for open source contributions.
- Make a 3D model of the LCD for a 3D-printed version.
- Post my own equipment list, costs, scans and curves.
- Make a UV photopolymer film contact test.
- Decide on the workflows and features I would like.
Workflows:
- Print purely digital files, e.g. from a digicam: not that interesting for me, but it seems like a good exercise for getting things precise.
- Print scanned negatives: will most likely do this most often. I have a few almost unprintable negatives that I have fixed well enough digitally.
  - Scan on Epson V850
  - Invert and correct in Photoshop
  - Make a paper print that looks like what I see on the screen
- Digital dodge and burn: place a negative that needs dodging and burning on the LCD. Use GIMP or Photoshop to make grade 1 and grade 5 dodge and burn masks, and use the computer to keep track of the exposure plan. The problem for me is that the LCD cuts out a lot of light; I might have to look into buying or making a high-power LED head for my Beseler 45MX. My computer also doesn't control the enlarger on/off switch.
- Simulate the analog workflow on the computer, for purely digital purposes or as an analog aid:
  - Scan negatives using a constant linear profile. I wonder how one does this in a way that can be communicated to/replicated by others? Scan with a Stouffer step wedge, perhaps.
  - Print a step wedge onto paper using each standard workflow (each filter grade) and record the exposure time. You can print other processes like cyanotype, salt prints or photogravure too.
  - Print toning variations, e.g. selenium or farmer's reducer.
  - If you are precise, you only have to do each thing once. If we can communicate the process precisely, only one person ever has to do each step (this would be a fun exercise in itself, for the right people!).
  - Now you build an app that applies these curves to the scanned negative. The app lets you adjust the exposure time and see the effect (I imagine fancier versions that simulate the mottled look of a cyanotype, or the look of Fomatone MG glossy vs Ilford MG Satin).
  - The first result is a digital image that is closer to what your photo would look like printed on your favourite paper, which is a desirable result in itself. (Inverting scanned negatives is terribly imprecise, and it has taken me years of practice to get even decent results; the charm of all the wonderful papers is nowhere to be found in that process.)
  - The second result is a plan for how to print your negative to get the look you liked in the app. Every bit of imprecision will add up, so it might not be exactly what you expected.
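The curve-application step in the plan above is essentially a 1-D LUT built from paired wedge measurements. A minimal sketch; the sample points below are made up for illustration, not measurements of any real paper:

```python
import numpy as np

# Map each pixel's L-value through the measured paper response by
# linear interpolation between wedge patches. The values here are
# illustrative placeholders only.
wedge_in  = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # digital wedge L
wedge_out = np.array([94.0, 60.0, 42.0, 37.0, 35.0])   # printed L

def simulate_print(image_L):
    """Map L-values (scalar or array) through the paper curve."""
    return np.interp(image_L, wedge_in, wedge_out)
```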
Could you post your lut file once you have it? That looks like a handy tool. I will probably use it next.
@avandesande, are you going to release the source code to your exposure app under an open source license? If I wanted to experiment I would have to start my own app from scratch, and you have already put a lot of work into making a workflow that works well for you. I bought the Dell 7050 SFF PC you linked, which came with Windows 10, so I have been able to use the exes you posted and the app mostly works fine.
Hi Ethan, here is a link to a raspberry pi image sent for me by the manufacturer that works with the 16k screen.
Congrats on working out the 3 bit conversion. Would you care to share the bit mapping? I spent about a day on it and gave up.
I thought it would be something obvious like
111 111 11 1 111 111 1 11 111 111
but I could never get it to work right. It always had problems on high resolution targets and I couldn't get anything out of it with the pattern generators I use for troubleshooting.
Thanks,
The bits are mapped so that the first pixel's value is (blue bit 1, green bit 1, red bit 1), the second pixel's is (blue bit 2, green bit 2, red bit 2), and so on.
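Taking that description at face value, the packing might look like the sketch below. Two assumptions of mine: that eight logical pixels share one byte per channel, and that blue carries each pixel's most significant bit (the post doesn't say which channel is which bit). `pack_pixels`/`unpack_pixels` are illustrative names, not anything from the actual software.

```python
# Pack eight 3-bit pixel values into one (R, G, B) byte triple, with
# pixel i taking bit i of the blue, green and red bytes.
def pack_pixels(values):
    """Pack eight 3-bit values (0..7) into (red, green, blue) bytes."""
    r = g = b = 0
    for i, v in enumerate(values):
        b |= ((v >> 2) & 1) << i   # assumed: MSB -> blue, bit i
        g |= ((v >> 1) & 1) << i
        r |= (v & 1) << i          # assumed: LSB -> red, bit i
    return r, g, b

def unpack_pixels(r, g, b):
    """Recover the eight 3-bit values from one byte triple."""
    return [(((b >> i) & 1) << 2) | (((g >> i) & 1) << 1) | ((r >> i) & 1)
            for i in range(8)]
```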
Well, I thought I had figured it out, but I just ran a test and it might not be correct… some pixels are showing up with the wrong values, so I’ve got to go back into the code and see what’s wrong. My suspicion is that one of the bits isn’t in the right order. Once I figure out where I went wrong I’ll let you know.

Wow, I am writing to the byte stream directly, so this is going to require some ugly code. Did you figure it out, or did you find a reference somewhere? I asked the manufacturer for documentation or an SDK and they didn't have anything.
Thanks!