When an arbitrary-precision number is entered with a backtick, the decimal mantissa/significand is used: 0.1`22 is interpreted from the decimal digit string, so it represents the exact value 1/10 carried to 22 digits of precision, the same as SetPrecision[1/10, 22].
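For readers outside Mathematica, the idea can be sketched with Python's decimal module (an analogy only, not Mathematica's actual mechanism): constructing the number from its decimal digit string keeps the decimal mantissa exact.

```python
from decimal import Decimal, getcontext

# Analogy to 0.1`22: build the value from its decimal digit string
# rather than from a machine double, at a chosen working precision.
getcontext().prec = 22
from_string = Decimal("0.1")   # exactly 1/10 in decimal
print(from_string)             # -> 0.1
```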
On the other hand, as documented, SetPrecision[0.1, 22] takes the machine-precision number 0.1 and pads it with binary zeros. In base 10 the added padding digits will in general not be zero, because the machine number is a binary fraction that only approximates 1/10. Try comparing the results of RealDigits in base 2 and in base 10 for 0.1, 0.1`22, and SetPrecision[0.1, 22].
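The binary-padding effect can also be seen outside Mathematica. A minimal Python sketch (again an analogy via the decimal module, not Mathematica itself): Decimal(float) converts the binary double exactly to decimal, exposing the nonzero digits that padding with binary zeros leaves behind.

```python
from decimal import Decimal

# The machine double 0.1 is the binary fraction closest to 1/10;
# converting it exactly to decimal shows the nonzero "padding" digits,
# analogous to what SetPrecision[0.1, 22] exposes.
exact = Decimal(0.1)
print(exact)
# -> 0.1000000000000000055511151231257827021181583404541015625
```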