Resolution for power meters is an interesting subject. If you measure power and the measurement is accurate to (or, as NIST prefers, has a measurement uncertainty of) 0.2 dB, a reading of 0.00 dB +/- 0.2 dB is confusing; the 1/100th dB resolution is in fact meaningless. If the uncertainty were 0.02 dB, hundredth-dB resolution would make sense. Remember, we are talking about "absolute power" measurements, calibrated relative to NIST standards.
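One way to see this idea is to let the uncertainty choose the displayed resolution. A minimal sketch (the function names and the one-digit-finer rule are illustrative, not any meter's actual firmware):

```python
import math

def display_digits(uncertainty_db):
    """Decimal places worth showing: the digit where the uncertainty lives.
    0.2 dB uncertainty -> 1 decimal place; 0.02 dB -> 2 decimal places."""
    return max(0, -math.floor(math.log10(uncertainty_db)))

def format_reading(power_db, uncertainty_db):
    """Format a power reading so the resolution matches the uncertainty."""
    d = display_digits(uncertainty_db)
    return f"{power_db:.{d}f} +/- {uncertainty_db} dB"
```

With a 0.2 dB uncertainty this prints "0.0 +/- 0.2 dB" rather than the misleading "0.00 +/- 0.2 dB".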
If we are looking at loss measurements, things change considerably. The loss of an LC connector, about 0.1 dB, is measured relatively, e.g. -15.00 dBm versus -15.10 dBm, and the measurement uncertainty now has nothing to do with the absolute power levels, only the RELATIVE difference between the two readings. That difference is as precise as the linearity of the power meter (better than 0.01 dB) plus the uncertainty of the mating of the connectors (a few hundredths, too). So you certainly want a meter with 0.01 dB resolution to test connectors!
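The relative measurement and its uncertainty can be sketched as follows. This is a toy illustration, not a calibration procedure: the linearity and mating-repeatability figures are the rough values mentioned above, and combining them by root-sum-square is an assumption:

```python
import math

def insertion_loss_db(ref_dbm, test_dbm):
    """Loss is simply the difference of two readings on the same meter,
    so absolute calibration error cancels out."""
    return ref_dbm - test_dbm

def relative_uncertainty_db(linearity_db=0.01, mating_db=0.03):
    """Rough combined uncertainty of a relative loss measurement,
    assuming RSS of meter linearity and connector-mating repeatability."""
    return math.sqrt(linearity_db**2 + mating_db**2)
```

For the example above, insertion_loss_db(-15.00, -15.10) gives about 0.10 dB, and the combined uncertainty is a few hundredths of a dB, which is why 0.01 dB resolution matters for connector testing.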
If you are testing an installed cable plant with, say, 3-10 dB of loss, the uncertainty is probably around 0.5 dB, so 0.1 dB resolution is adequate. If it's a long-haul network with 30 dB of loss, the uncertainty can be over 1 dB, so 0.1 dB resolution is much more than adequate.
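The rule of thumb in all three cases is the same: resolution only needs to be as fine as the uncertainty. A minimal sketch, using the ballpark uncertainty figures from this discussion (the function names and the simple threshold are illustrative):

```python
def resolution_is_adequate(resolution_db, uncertainty_db):
    """Resolution is adequate when it is at least as fine as the
    measurement uncertainty; anything finer adds digits, not information."""
    return resolution_db <= uncertainty_db

# 0.1 dB resolution vs. an installed plant's ~0.5 dB uncertainty: fine.
# 0.01 dB resolution vs. a connector test's ~0.03 dB uncertainty: needed.
```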