Q: How do I read accuracy specifications when it shows % reading +/-counts or % reading +/- digits?
A: When looking at specifications for meters and power supplies, it's common to see accuracy specified as % of reading +/- counts or % of reading +/- digits. Before explaining what this specification means, the resolution must be taken into account, because accuracies are often tied to a specific resolution or range of the instrument.
Suppose a multimeter has an accuracy specification of +/-0.5% + 10 counts on the 5 V DC voltage range. Assuming it is a 50000 count multimeter, on that range it can display readings up to 4.9999 V, with a resolution of 0.0001 V (0.1 mV).
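As a rough sketch of how the count specification relates to resolution (the variable names here are just illustrative, not tied to any particular meter):

```python
counts = 50000          # a "50000 count" display
full_scale = 5.0        # the 5 V range
resolution = full_scale / counts          # 0.0001 V per count (0.1 mV)
max_display = (counts - 1) * resolution   # highest displayable value: 4.9999 V
print(resolution, round(max_display, 4))
```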
Assume the measured reading is 2.0000 V. According to the accuracy specification for this range and resolution, we take +/-0.5% of the reading and then add 10 counts (digits). A count, or digit, is one unit in the least significant digit of the display, so its value depends on the resolution.
For example, 0.5% of 2.0000 V is 10 mV (0.010 V). At 0.0001 V resolution, 10 counts equal 0.0010 V. The specification therefore tells us the true value lies between 1.9890 V (2.0000-0.010-0.0010) and 2.0110 V (2.0000+0.010+0.0010).
The counts are added at the last displayed digit, so their value scales with the resolution of your reading.
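The calculation above can be sketched in Python; the function name and signature here are my own, purely for illustration:

```python
def accuracy_bounds(reading, pct, counts, resolution):
    """Return (low, high) bounds for a '% of reading + counts' accuracy spec.

    The counts are added in units of the display resolution
    (one count = one unit of the least significant digit).
    """
    uncertainty = reading * pct / 100 + counts * resolution
    return reading - uncertainty, reading + uncertainty

# 2.0000 V reading, +/-0.5% + 10 counts, 0.0001 V resolution
low, high = accuracy_bounds(2.0000, 0.5, 10, 0.0001)
print(f"{low:.4f} V to {high:.4f} V")  # 1.9890 V to 2.0110 V
```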
The reason resolution becomes important is that the digits or counts are added at the last displayed digit. For example, if the display resolution were 0.001 V instead, so the same input reads 2.000 V, the same accuracy specification would give 0.5% of 2.000 V = 0.010 V plus 10 counts = 0.010 V, meaning the reading is accurate between 1.980 V and 2.020 V. Comparing with the previous result, the two intervals are clearly not the same. The resolution defines how many digits will be displayed, but the counts or digits are just values added at the last digit of the number being displayed.
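A minimal check of this coarser-resolution case, using nothing beyond the numbers above:

```python
# Same spec (+/-0.5% + 10 counts) applied at a coarser 0.001 V resolution
reading = 2.000
uncertainty = reading * 0.5 / 100 + 10 * 0.001   # 0.010 V + 0.010 V = 0.020 V
print(f"{reading - uncertainty:.3f} V to {reading + uncertainty:.3f} V")
# 1.980 V to 2.020 V -- a wider interval than at 0.0001 V resolution
```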