I have a Fluke digital multimeter and also an Amprobe clamp-on from many years ago, i.e. an analog needle volt/ammeter.
I also have a Volt/Frequency monitor that I use in my Motorhome.
Since I'm currently messing around with a freshly installed standby generator, I'd like to be as accurate as possible, but none of the meters really agree. I realize that some of the discrepancy may just be the +/- tolerance of each meter, or perhaps the type of meter.
Making frequency my priority, I wanted to be sure my RV monitor was accurate, so I went out and bought a true-RMS clamp-on today that also does frequency. I also wanted a meter to use with UPSes.
The big 'shock' for me was that while the frequency was 60.0 Hz on the RV monitor and 59.9 Hz on the true-RMS meter, the volts were 107 on the RV monitor versus 113.5 on both the Fluke and the new true-RMS meter (within a couple of tenths, that is).
The Amprobe was showing 112, but that's an approximation at best since the graduations on the scale are pretty coarse.
The new meter is an Extech 380926. Any experiences out there with this model, or with meters in general? I imagine even if I had two meters of the same make I'd still get some differences.
I guess my overall question is whether I can trust the new Extech as "the" meter to use, since as a true-RMS meter it should have better electronics than a 10-year-old Fluke.
Sorry about the rambling, just trying to get some confirmation that I'm on the right track with the Extech.
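One possible explanation for the voltage gap worth keeping in mind: if the RV panel monitor is average-responding rather than true RMS (many inexpensive ones are; I'm assuming that here, so check its spec), it will only read correctly on a clean sine wave. Here's a rough sketch of the effect, with a made-up flat-topped waveform standing in for distorted power; the 0.85 clip level and 120 V figures are just for illustration:

```python
import math

def readings(wave, n=100_000):
    """Return (true RMS, average-responding reading) for one cycle of wave(t)."""
    samples = [wave(2 * math.pi * k / n) for k in range(n)]
    true_rms = math.sqrt(sum(v * v for v in samples) / n)
    # An average-responding meter rectifies, averages, then scales by the
    # sine-wave form factor pi/(2*sqrt(2)) ~ 1.1107, so it reads correctly
    # ONLY for a pure sine wave.
    avg_responding = (sum(abs(v) for v in samples) / n) * (math.pi / (2 * math.sqrt(2)))
    return true_rms, avg_responding

A = 120 * math.sqrt(2)  # peak of a nominal 120 V sine
sine = lambda t: A * math.sin(t)
# Hypothetical flat-topped (clipped) waveform, as from a loaded generator/inverter
clipped = lambda t: max(-0.85 * A, min(0.85 * A, A * math.sin(t)))

print(readings(sine))     # both come out ~120.0 on a clean sine
print(readings(clipped))  # the two readings diverge on the distorted wave
```

So two meters can both be "within spec" and still disagree by a few volts on a distorted waveform, simply because they measure different things.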
There is no way to know without a calibrated meter. I have a Fluke 73 that I've used maybe three times that is NIST-calibrated and has the traceability certificate. You pay about 75 bucks extra for the meter to come with a certified calibration. The only way to know if you're on the right track is to compare all your meters against readings from a calibrated meter. Best of luck to you...
On a side note to mdshunk's post, the same applies to temperature sensors. As the OP said, +/- some percentage on either the meter or the sensor is about all you can expect if you don't have the equipment calibrated.
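A quick way to decide whether two readings are even worth arguing over is to check whether their uncertainty bands overlap. The specs below are hypothetical placeholders (a typical handheld DMM quotes something like +/-(0.5% + 2 digits); look up your actual datasheets):

```python
# Hypothetical accuracy specs -- substitute the figures from your meters' datasheets.
def band(reading, pct, counts, resolution):
    """Uncertainty interval for a reading given a +/-(pct% + counts) spec."""
    err = reading * pct / 100 + counts * resolution
    return (reading - err, reading + err)

def consistent(b1, b2):
    """True if the two uncertainty intervals overlap."""
    return b1[0] <= b2[1] and b2[0] <= b1[1]

fluke = band(113.5, 0.5, 2, 0.1)  # assumed +/-(0.5% + 2 digits), 0.1 V resolution
rv    = band(107.0, 2.0, 0, 1.0)  # assumed a panel monitor claiming only +/-2%

print(fluke, rv, consistent(fluke, rv))
```

With those assumed specs the intervals don't overlap, so a 107 vs 113.5 gap can't be blamed on tolerance stack-up alone; one of the instruments is out of calibration or is measuring the waveform differently.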