A modern digital transmission densitometer - anything built around a microprocessor and A/D converter, which is to say anything made after 1977 or so - shouldn't need calibration. A transmission densitometer measures how well something transmits light relative to nothing at all - and nothing is, well, nothing: the absence of any calibration standard.
A transmission densitometer can self-calibrate by taking two measurements from the photodiode with nothing in the light path: one with its light on (the 'blank' reading) and one with its light off (the 'dark' reading).
Density is then:
OD = -log10((sample - dark) / (blank - dark))
Where 'sample' is the signal from the photodiode with the sample in the light path.
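As a sketch in Python - the A/D counts below are hypothetical, not from any particular meter - the whole self-calibration-and-measure cycle is just:

```python
import math

def optical_density(sample, blank, dark):
    """Transmission density from three photodiode readings:
    'blank' (light on, empty path), 'dark' (light off),
    'sample' (light on, sample in path)."""
    return -math.log10((sample - dark) / (blank - dark))

# Hypothetical A/D counts: dark current 12, empty path 4012, sample 412.
# (412 - 12) / (4012 - 12) = 0.1 transmittance, so OD = 1.0
print(round(optical_density(412, 4012, 12), 2))  # prints 1.0
```

Subtracting the dark reading from both terms removes the photodiode's dark current and any stray light, which is exactly why no external standard is needed.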
The story is different for reflection measurements. There, calibration is needed because a reflection measurement is the ratio of the sample signal to the signal from something 100% reflective (or, with a little mathematical juggling, the ratio of the sample reading to a reading of something of known reflection density).
If you don't need to share data with other densitometers, it doesn't really matter what you calibrate the meter to as long as you always use the same thing. If you calibrate on glossy photo paper - bright base white at 0.15 and blackest black at 2.3 - you will be close enough that it really doesn't matter. 1.57 OD will be _your_ 1.57 OD: call it 1.57 MD (My Density). That your 1.57 MD equals someone else's 1.68 OD doesn't matter a hoot, because you are always comparing two things: your print against your black/white calibration photograph. If you use the manufacturer's calibration tablet, you are just comparing your print to the manufacturer's tablet.
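The "mathematical juggling" for the reflection case is a one-line rearrangement. A sketch (the function name and the readings are made up for illustration; the 0.15 figure is the paper-base density used above):

```python
import math

def reflection_density(sample, reference, ref_density=0.0, dark=0.0):
    """Reflection density relative to a calibration patch whose known
    density is 'ref_density' (0.0 for a perfectly reflective white).
    D = ref_density - log10(sample / reference), dark-corrected."""
    return ref_density - math.log10((sample - dark) / (reference - dark))

# Calibrated on glossy paper base (0.15): reference reads 2000 counts,
# the sample reads 200 counts, i.e. one-tenth the reflectance.
print(round(reflection_density(200, 2000, 0.15), 2))  # prints 1.15
```

Whatever you plug in for `ref_density` just shifts the whole scale, which is why any consistent reference - paper base, manufacturer's tablet, whatever - works for comparisons against your own measurements.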
To extend the self-calibration theme - you can make a very usable Zone print step tablet by
o Using #2 1/2 glossy paper, find the exposure for the first pure white and
the first pure black - the place where it doesn't get any whiter and the
place where it doesn't get any blacker.
o Using the settings from your f-stop timer (or a time/stops chart), divide
the exposure range found above into nine equal steps, giving ten
exposures. If pure white was 3.3 stops on the timer/chart and pure
black was 7.5 stops, then each zone step from white to black
is .47 stops.
o Make a set of prints at the zone step intervals. For the above example,
make a set of prints at 3.3, 3.8, 4.2, 4.7, 5.2, 5.6, 6.1, 6.6, 7.0 and 7.5
stops of exposure.
o You now have a zone step tablet. Keep it on file and use it to compare
its tones to your prints' tones.
Adams defines zones as the tones resulting from breaking the H&D curve into ten equal exposures. So you actually have a _true_ Zone System tablet.
The same thing can be done with negatives to make a transmission zone step tablet, but you have to decide what you want black (the dense part of the negative) to be. A good choice is a black that prints as pure white on #2 1/2 paper when a just-barely-there density in the negative produces the blackest black on the print.
Older analog densitometers usually have lots of fiddly little adjustments because analog circuitry isn't very good at math, and analog logarithmic circuits are very sensitive to temperature fluctuations (and any other event that may come along). The same is true of quasi-digital densitometers made before 1973 that have analog circuits but a digital display instead of the older meter/needle movement. '73-'77 vintage digitals can go either way - microprocessor or analog signal processing.