I am running code in Mathematica (on Mac OS X) that takes large values for some of the variables. In the Notebook, I set the global precision using the following code (which is well documented!) at the top of my 'test_script.m' script to get the desired precise results:
$PreRead = (# /. s_String /;
      StringMatchQ[s, NumberString] &&
       Precision@ToExpression@s == MachinePrecision :> s <> "`50." &);
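For reference, the intent of the rule is to rewrite every plain machine-precision numeric literal into an arbitrary-precision one before the kernel evaluates it; a minimal check of what the appended suffix does:

```mathematica
(* appending "`50." marks a literal as a 50-digit arbitrary-precision number *)
Precision[ToExpression["1.5"]]      (* MachinePrecision *)
Precision[ToExpression["1.5`50."]]  (* 50. *)
```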
It works fine as long as I am using the Notebook. However, when I try to run it with the MathematicaScript kernel via the following command line:
./MathematicaScript -script ~/test_script.m
I am unable to set the precision, and the computation gives inaccurate results. Just to make sure, I gave smaller test values to the variables, and both processes yielded exactly the same results. This has led me to believe that in the MathematicaScript kernel, `$PreRead` is not effective.
So, Q: How can I set the precision to my desired value in this case, i.e. achieve the same result in the terminal as I do in the Notebook? Also, why is there a difference between running through the kernel and through the Notebook? Shouldn't they behave identically? Is `$PreRead` interpreted differently in the two cases?
Your `$PreRead` approach is an extremely hackish and fragile way of doing things! You should resort to such methods only when there is no straightforward/reliable way to achieve your goal (usually there is one). For your particular goal, you should simply use SetPrecision...
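As a sketch of that alternative (the variable names and values below are placeholders, not from the original script): wrapping the inputs with SetPrecision at the top of the script makes the precision explicit in the expressions themselves, so it behaves the same in the Notebook and under `-script`:

```mathematica
(* hypothetical inputs; SetPrecision lifts them to 50 significant digits *)
x = SetPrecision[1.23456789, 50];
y = SetPrecision[9.87654321*10^8, 50];
Precision[x]  (* 50. *)
```

Because the precision now lives in the values rather than in an input-rewriting hook, no front-end or kernel hook needs to fire for it to take effect.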
I found another case in which the above code fails. When loading data files, the numbers loaded still have MachinePrecision. I understand the above code will need modification, but how? Can anyone give us some suggestions? Thanks.
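One possible adaptation (the file name here is hypothetical): numbers produced by Import never pass through the input-string stage that `$PreRead` rewrites, so their precision has to be raised explicitly after loading:

```mathematica
(* hypothetical data file; Import returns machine-precision reals *)
data = Import["data.dat", "Table"];
data50 = SetPrecision[data, 50];  (* SetPrecision maps over the whole list *)
```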