I recently bought a $30 digital Radio Shack multimeter, and when I went to check the resistance of some old and new replacement resistors I began to wonder how accurate the meter really is. I hook the alligator clips to a resistor, set the whole assembly down on a table, and wait for the readout to stabilize. On the old, nominal 2.7Ω (5%) resistor the readout fluctuates between 2.6-2.8Ω; on the new 2.7Ω (2%) resistor it fluctuates between 2.7-2.8Ω. Checking the old 4.7Ω (10%) resistor I get 4.4-4.6Ω; the new 4.7Ω (2%) resistor reads 4.7-5.1Ω.
So for my purpose of testing the resistors, what value should I choose from the measured range: high, low, or in between? (I assume that the percentage figure printed on the resistor is +/-. That is, 5% of 2.7Ω is 0.135Ω, so the resistor could vary between 2.565Ω and 2.835Ω.) By that reckoning the old resistors meet their percentage specifications, but the new 4.7Ω resistor reads outside its 2% band of 4.606-4.794Ω.
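To double-check the tolerance arithmetic, here is a minimal sketch; the `tolerance_band` helper is just a name I made up for illustration:

```python
def tolerance_band(nominal, tol_pct):
    """Return the (low, high) ohm values a resistor of the given
    nominal value and +/- percentage tolerance may legitimately have."""
    delta = nominal * tol_pct / 100.0
    return (nominal - delta, nominal + delta)

# 2.7 ohm at 5%: 5% of 2.7 is 0.135, so the band is 2.565-2.835 ohm
print(tolerance_band(2.7, 5))
# 4.7 ohm at 2%: 2% of 4.7 is 0.094, so the band is 4.606-4.794 ohm
print(tolerance_band(4.7, 2))
```

A measured range that pokes outside the band (like 4.7-5.1Ω against 4.606-4.794Ω) is the case in question, assuming the meter itself can be trusted.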
I also looked at the multimeter specifications, which read "+/- 1.2% of Reading, +/- 4 in last digit". With the last digit worth 0.1Ω on this range, those 4 counts alone are ±0.4Ω, and 0.4 is 15% of 2.7 and 9% of 4.7! So is this multimeter any good at all for testing these resistors?
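The spec can be turned into a worst-case uncertainty per reading. This is a sketch under the assumption that the last digit resolves 0.1Ω on the range used; the function name is mine, not anything from the manual:

```python
def meter_uncertainty(reading, pct=1.2, counts=4, resolution=0.1):
    """Worst-case error (ohms) for a spec of '+/- pct% of reading,
    +/- counts in the last digit', given the last-digit resolution."""
    return reading * pct / 100.0 + counts * resolution

for r in (2.7, 4.7):
    u = meter_uncertainty(r)
    print(f"{r} ohm reading: +/- {u:.3f} ohm ({100 * u / r:.0f}% of reading)")
```

At a 2.7Ω reading the worst-case error works out to about ±0.43Ω, roughly 16% of the reading, which dwarfs a 2% (or even 5%) resistor tolerance at these low values.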
Or am I overthinking again?