"OK, I have a snowy scene and a polar bear. If I 'shoot to the right', my pixels are all up in the upper range of the histogram. If I did not see the scene, how do I known the pixels truly belong in the upper brightness range, vs down at the lower end? I have a black coal mine scene and a black cat. If I 'shoot to the right', my pixels are all up in the upper range of the histogram. If I did not see the scene, how do I know the pixels truly belong in the upper brightness range, vs down at the lower end?"
First and foremost, you probably do not have those scenes, nor do most people. And even as extreme examples, they illustrate nothing of your point.
Next, I do not understand your supposition "If I did not see the scene". Of course you saw the scene.
Next, you seem to assume that photography is supposed to provide a realistic and literal reproduction of reality. Not only can it not, but why would you want it to?
You also assume that using a histogram means that you are following the expose to the right rule. This is not true at all. That is a shortcut followed by the types of people who rely on shortcuts. A histogram is simply a piece of information. It tells you contrast and tonal placement (which you claim it does not), among other things. You are blaming the tools when you should be blaming the users. Do you call sports cars useless simply because lots of people driving them are bad drivers? You are making a useless and purely technical argument.
Additionally, if you are going to make a purely technical argument, you have to play "devil's advocate" in a way and make the technical argument across the board. There are very sound technical reasons why "expose to the right" gives the most versatile exposures. The expose to the right rule exists because the ways one optimally exposes digital are different from the ways one optimally exposes film. There are many digital people who could argue far better than I can why you would, in fact, expose that coal mine scene to the right. It has to do with the fact that more printable information is captured by doing so: a sensor records light linearly, so each stop below clipping contains half as many discrete tonal levels as the stop above it, and exposing to the right keeps the scene's tones in the level-rich, least noisy part of that range. Just like film, the capture step simply gives you what you need to make the best final product in the darkroom. Do you think that digital pix are different from film pix in that they magically pop out of cameras ready to print? They are not. Your particular printing process informs your exposure decisions, as does your medium of capture.
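To put a number on "more printable information", here is a minimal sketch, assuming a 12-bit linear raw file (the bit depth is my assumption for illustration, not anything from this thread), of where the tonal levels actually live in a linear capture:

```python
# A minimal sketch, assuming a 12-bit linear raw capture (assumed bit
# depth, illustration only): a sensor records light linearly, so each
# stop below clipping holds half the discrete levels of the stop above.
BIT_DEPTH = 12
total_levels = 2 ** BIT_DEPTH  # 4096 levels in a 12-bit file

for stop in range(1, 6):
    top = total_levels // 2 ** (stop - 1)  # level at the top of this stop
    bottom = total_levels // 2 ** stop     # level at the bottom of this stop
    print(f"Stop {stop} below clipping: {top - bottom} levels")

# Output:
# Stop 1 below clipping: 2048 levels
# Stop 2 below clipping: 1024 levels
# Stop 3 below clipping: 512 levels
# Stop 4 below clipping: 256 levels
# Stop 5 below clipping: 128 levels
```

Underexposing that coal mine scene would park its tones down in the level-starved (and noisiest) stops; exposing it to the right keeps them where the levels are, and you place them back down where they belong in the print.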
Even so, I feel the same as you do about spot meters and placement. The Pentax Digital Spot is my #1 meter. I tend to approach digital the same way I approach film, simply because I am comfortable with the familiar methods and I usually get fine results. This does not mean that I am an absolutist or that I am ignorant of the benefits of technology, however. Sometimes, due to the differences between media, this approach does not give optimum results, or even printable results, and one must alter the familiar way of working in order to get the DESIRED RESULTS; not to prove some useless technical argument. Regardless of medium, to get what you want, all photographs must be captured in a way that gives you what you need to make the print; not perfectly, not ideally, just usably. You need to know what you want in order to do it best, but if you are unsure, or might want to do several different things with it, there are useful rules of thumb for film and digital that will simply provide you with a usable and versatile exposure.
To blindly dismiss histograms and the expose-to-the-right rule as a "crutch" is to be wrapped up in principle and technique rather than to focus on practicality and results. What you are really doing is saying that you don't like histograms and you don't need them because you are technically good. Well, just come out and state it as a personal opinion. Don't try to prove it.
"The point I am making, on this analog forum, is that a well know reason for use of spot meters allows you to deal with scenes in a manner in which no conventional camera meter will permit, even with the digital crutch of a histogram."
Well, of course. Where did I say to throw away your meter cuz you have the be all and end all of exposure tools: the histogram? I'm not saying you are wrong about metering being preferable. I am saying that you are wrong and arrogant in totally dismissing a photographic tool on principle, and trying to turn that matter of personal principle into a technical argument.
"I don't care that I didn't impress you, because impressing anyone was not the point I was making. The point was that 'placement' is something consciously chosen in the act of exposure, and the spotmeter permits you to do that well, unlike a histogram. I can choose which brightness level in the scene I want at the mid-point, using the spotmeter."
You still do not seem able to admit that a histogram can be a useful tool, that it tells one contrast and tonal placement. It displays the "distribution of pixels", as you say, but from this you can gather much useful information, including the placement and falloff of high and low tones, and overall contrast, among other things. Your statements are like saying that, for instance, a graph of speed over time doesn't tell you much because it is nothing but a line and some numbers. It tells you max speed, min speed, average speed, at what speed you were at what time, etc., etc. The power is not in the line and the numbers, but in the human brain's interpretation thereof.
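To extend the analogy, here is a minimal sketch, assuming an 8-bit grayscale capture loaded as a NumPy array (the function name and setup are mine, purely for illustration), of the kind of information a brain, or a few lines of code, can read off a bare "distribution of pixels":

```python
# A minimal sketch (illustrative only) of information recoverable from a
# histogram: none of these numbers are drawn on the graph itself, yet all
# of them follow directly from the distribution of pixels it shows.
import numpy as np

def describe_histogram(image: np.ndarray) -> dict:
    """Summarize an 8-bit grayscale image via its histogram."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    occupied = np.nonzero(hist)[0]  # tonal levels actually present
    return {
        "darkest tone": int(occupied.min()),      # low-tone placement
        "brightest tone": int(occupied.max()),    # high-tone placement
        "tonal range": int(occupied.max() - occupied.min()),  # overall contrast
        "mean tone": float(image.mean()),         # where the exposure sits
    }

# Hypothetical usage:
#   stats = describe_histogram(capture)
# A narrow "tonal range" means a flat scene; a high "mean tone" with an
# unclipped "brightest tone" is precisely the exposed-to-the-right condition.
```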
Nobody is telling you how to work. In fact, I imagine that we work similarly and have similar personal working principles. What I am suggesting is that you not throw your opinion around as uninformed fact, and I am informing you that you are coming off as a braggart, which is laughable and annoying in person, but especially so on the Internet, where everybody here might as well be Adam. I, and I imagine others, grow weary of claims of greatness on the Internet.