Hi all,
while analysing some code in order to learn the WL properly, I came across the following "feature":
{a, Sequence @@ {}, b}
and
{a, Sequence[], b}
both give the identical result, namely {a, b}.
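Checked on its own, the applied form does indeed seem to reduce to the bare one (at least on my installation):
Sequence @@ {}   (* -> Sequence[], so inside a plain list the two forms look interchangeable *)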
But inside an "If" statement, for example, this obviously makes a big difference:
If[# < 1, Sequence @@ {}, #] & /@ {0.5, 1.5}
gives {1.5}, which is the expected result. But the use of
If[# < 1, Sequence[], #] & /@ {0.5, 1.5}
results in {0.5, Null}, which is kind of the opposite of what was wanted.
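To take Map out of the picture, the single calls already seem to show the difference (results as I see them here):
If[0.5 < 1, Sequence @@ {}, 0.5]   (* -> Sequence[], which then vanishes when spliced into the list built by Map *)
If[0.5 < 1, Sequence[], 0.5]       (* -> 0.5 *)
If[1.5 < 1, Sequence[], 1.5]       (* -> Null *)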
In trying to understand this, "Trace[]" might be helpful: it seems to me that in
the "Sequence @@ {}" case the If statement is evaluated first (?), while in the
"Sequence[]" case the arguments of the If statement - as a sequence - are evaluated first (?).
In general my feeling is that in the WL there is nothing that is not clever in some way!
But what is the underlying concept here?
Many thanks for any comment!
Henrik