What I think onlookers are having difficulty with is believing that, with your other eye open and seeing the whole scene, the brain can superimpose that scene over the black viewing screen, letting you move the spot onto whatever you have chosen in the scene as your metering point.
On the one hand, provided your brain is able to superimpose the scene for long enough to aim the spot and secure a reading, it should work fine. It almost seems too easy to be true, but on the other hand, none of the three reviewers seems to have any difficulty doing what the inventor says you need to do.
Couldn't you just use a phone as a spot meter? My light meter app does let you zoom into the image to take a more specific reading, though not one as narrow as a spot. But couldn't you write an app where you take a picture with the phone, it reads individual pixel luminance values and calculates the contrast range, and you then zoom into the image and precisely select a spot? I'd be surprised if it didn't exist already. A new spot meter seems a bit archaic when we all carry a phone with us, which is already a meter coupled to a powerful computer.
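For what it's worth, the two calculations described above (whole-scene contrast range and a spot reading from selected pixels) are simple once you have linear luminance values. The sketch below is a hypothetical illustration, not any existing app: it uses a tiny synthetic grid of made-up luminance values standing in for a photo, and the function names (`contrast_range_stops`, `spot_reading`) are my own. A real app would need calibrated data from the camera's RAW output to turn pixel values into absolute exposure readings.

```python
import math

# Hypothetical stand-in for a photo: a 2D grid of linear luminance
# values. Real pixel data would come from the phone camera, ideally
# RAW and calibrated against a known exposure.
scene = [
    [10.0, 20.0, 40.0,  80.0],
    [10.0, 20.0, 40.0,  80.0],
    [ 5.0, 20.0, 40.0, 160.0],
    [ 5.0, 20.0, 40.0, 160.0],
]

def contrast_range_stops(pixels):
    """Scene contrast range in stops: log2 of brightest over darkest."""
    flat = [v for row in pixels for v in row]
    return math.log2(max(flat) / min(flat))

def spot_reading(pixels, cx, cy, radius=0):
    """Mean luminance of a small square 'spot' centred on (cx, cy)."""
    vals = [pixels[y][x]
            for y in range(max(0, cy - radius),
                           min(len(pixels), cy + radius + 1))
            for x in range(max(0, cx - radius),
                           min(len(pixels[0]), cx + radius + 1))]
    return sum(vals) / len(vals)

print(contrast_range_stops(scene))       # 5.0 stops darkest to brightest
print(spot_reading(scene, 3, 0))         # 80.0 (single-pixel "spot")
```

Zooming to "precisely select a spot" then amounts to choosing `(cx, cy)` and shrinking `radius`; placing a value on a Zone System zone is just its offset in stops from the metered middle grey.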
One could go to all the trouble of writing, testing, and debugging the software, but I doubt one could get a 1-degree spot and aim it correctly, let alone adjust it for the Zone System. Forget it; I will stick to the Pentax Digital Spot Meter.