What is the difference between SetPrecision and the backtick (`) notation?
When you look up ` in the Documentation Center, it returns the page for Precision. So what does appending a backtick to a number actually do?
When an arbitrary-precision number is entered with a backtick, as in 0.1`22, the decimal digits themselves are used as the significand, so what effectively happens is N[1/10, 22]: the number is exactly 1/10, carried to the requested 22 digits.
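A quick way to check this (taking 22 digits as the example precision):

```mathematica
(* 0.1`22 takes the decimal digits at face value, so it agrees
   exactly with 1/10 evaluated to 22 digits: *)
0.1`22 == N[1/10, 22]

(* and it carries the requested precision: *)
Precision[0.1`22]
```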
On the other hand, as documented, SetPrecision[0.1, 22] takes the machine-precision number 0.1 and pads its binary representation with zeros. In base 10, those padding digits will not be zero in general. Try comparing the results of RealDigits in base 2 and base 10 for 0.1, 0.1`22, and SetPrecision[0.1, 22].
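Concretely, the comparison looks like this (outputs omitted; run these and compare the digit lists):

```mathematica
(* Base 10: 0.1`22 has the digit 1 followed by pure zeros, while
   SetPrecision[0.1, 22] shows nonzero decimal digits past the
   17th place, inherited from the binary approximation of 0.1: *)
RealDigits[0.1`22, 10]
RealDigits[SetPrecision[0.1, 22], 10]

(* Base 2: here it is SetPrecision[0.1, 22] that ends in pure
   zeros -- the machine bit pattern padded with binary zeros: *)
RealDigits[0.1`22, 2]
RealDigits[SetPrecision[0.1, 22], 2]
```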
Thanks for the helpful reply. To make sure I understand: when using SetPrecision, the 0.1 is first converted to binary, which must be an approximation since 0.1 does not terminate in binary, and is then padded with zeros to reach the requested base-10 precision. So it is not the padding with zeros that causes the difference, but the conversion to binary. Is that right?
Yes, that's exactly right. The padding still takes place, but it is the rounding to binary that produces these extra digits:

In[1]:= FromDigits[RealDigits[0.1, 2], 2]
Out[1]= 3602879701896397/36028797018963968
You get the same result from

SetPrecision[0.1, Infinity]

as well. So it seems that just converting 0.1 to binary causes the difference. How do you get the Input/Output pairs displayed the way you did?
Yes, sorry, I realized that in the meantime and edited my last reply.
The code formatting feature can be used by typing Ctrl-K or by clicking the <> button.