Message Boards


Issue with automated function definitions

Posted 2 years ago
9 Replies
7 Total Likes

I am using version 11.3. I want to generate some definitions that involve time-consuming symbolic computations; I have replaced the time-consuming part with a simple expression here.

 Do[Block[{tmp}, tmp = x + y + z + 10;
    f[x, y, z] := Evaluate[Module[{tmp1}, tmp1 = tmp; tmp1^2]]],
   {x, -1, 1}, {y, 0, 2}, {z, 0, 2, 2}]

In[2]:= ?f


The "=" in the output of "?f" should be ":=". If you change the range of x to "{x, -1, 0}", the definitions shown by "?f" use ":=". The same holds for the other iteration variables y and z. Is this a bug, or am I missing something?

9 Replies

I don't understand why you want to use delayed evaluation for numbers.

Obviously, my original program is not just for numbers. On the right hand side of the definition there could be global variables. But this is not essential to demonstrate the bug.

The situation is as follows. For each set of arguments, the calculation takes a long time. Because the arguments can be enumerated, I want the right-hand side to be partly evaluated in the definition for each set of arguments. I will then copy the results and run them separately, so that once the other global variables on the right-hand side are set, evaluating each set of arguments does not take long. It seems that "=" does the same thing, as long as it runs before the global variables are set. In any case, I prefer to use ":=" in the definition, and that is how I ran into this bug.

I don't know whether there is a better way to do this.
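A sketch of the kind of thing I mean, using With to inject the precomputed result into a delayed definition (so the expensive part evaluates once per argument set, yet the stored rule stays delayed):

```
Do[
 With[{rhs = Module[{tmp}, tmp = x + y + z + 10; tmp^2]},
  (* rhs is already evaluated here; the definition itself remains := *)
  f[x, y, z] := rhs],
 {x, -1, 1}, {y, 0, 2}, {z, 0, 2, 2}]
```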

Thank you for taking the time to inform us about this issue. I've forwarded the example to the developers.

For the majority of cases, you should always avoid programmatically defining down-values of functions. It's impossible in most programming languages, and in the languages I know of where it is possible, it's discouraged. That said, there are cases where it's apparently unavoidable.

You can see a similar issue here:

"For the majority of cases, you should always avoid programmatically defining down-values of functions."

What would distinguish this from memoization? Stated differently, how is memoization not a form of programmatically creating down-values?
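For concreteness, I mean the usual idiom, where each call adds a down-value as a side effect of evaluation:

```
fib[0] = 0; fib[1] = 1;
(* the inner Set stores f[n] -> result, so later calls are lookups *)
fib[n_] := fib[n] = fib[n - 1] + fib[n - 2]
```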

Posted 2 years ago

Memoization, if you want to view it as programmatically creating function definitions, is at least a very limited form of it. And in most languages that have memoization, it's not done as a side effect of evaluating the function; instead it's a language-level feature. Even in Python, where you'd likely do it with a decorator, you hide any obvious state and try to make it look like a language-level feature. I'd say the WL is pretty unique in having such an explicit and structurally obvious syntax for it.

The only reason it's not more commonly advised against, AFAIK, is that it's basically impossible to define a function in a loop like this in most languages. The closest I've seen in production code was a loop calling a factory. You have to use some very heavy abstraction to effect it.

Overall, programming advice is subjective, but I think the fact this is widely disallowed in most styles is a pretty solid sign. Most cases of this I've seen in WL tend to involve Symbolize and the creation of unique function names or the attempt to use Indexed/Subscript in some way.

Posted 2 years ago

Just my two cents, but I would say that this kind of thing is actually a powerful byproduct of the design of the language. I use programmatic DownValues / UpValues / SubValues all the time in meta-programming and OOP. To be honest, there's no really good alternative simply by design of the language. In particular, because Mathematica provides such weak encapsulation mechanisms, meta-programming is one of the only ways to cleanly provide an intelligently modularized interface without having to introduce hundreds of new symbols each of which requires maintenance and bug-fixes and documentation.

On the other hand, in this particular example there's no reason for this kind of thing.

I use it often as a way of caching values needed for lookup later. For example, say you wanted to implement the default version of DeleteDuplicates in the Wolfram Language. (I use this as an example because it was proof-of-concept code, once upon a time, for why/how we should improve on the shabby O(n^2) method then in use.)
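A sketch of that idea (my reconstruction, not the original proof-of-concept code): store each element seen so far as a down-value of a local symbol, so the membership test becomes a hash lookup instead of a linear scan:

```
deleteDuplicatesSketch[list_] :=
 Module[{seen},
  Select[list,
   (* seen[#] is unevaluated on first encounter, so TrueQ gives False;
      we then record it with a down-value and keep the element *)
   If[TrueQ[seen[#]], False, seen[#] = True; True] &]]

deleteDuplicatesSketch[{1, 2, 1, 3, 2}]  (* {1, 2, 3} *)
```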

The = sign in Definition[f] doesn't mean much: it's just cosmetic. You should look at the DownValues of f. As you can see, all DownValues are delayed rules:

In[13]:= DownValues[f]

Out[13]= {HoldPattern[f[-1, 0, 0]] :> 81, 
 HoldPattern[f[-1, 0, 2]] :> 121, HoldPattern[f[-1, 1, 0]] :> 100, 
 HoldPattern[f[-1, 1, 2]] :> 144, HoldPattern[f[-1, 2, 0]] :> 121, 
 HoldPattern[f[-1, 2, 2]] :> 169, HoldPattern[f[0, 0, 0]] :> 100, 
 HoldPattern[f[0, 0, 2]] :> 144, HoldPattern[f[0, 1, 0]] :> 121, 
 HoldPattern[f[0, 1, 2]] :> 169, HoldPattern[f[0, 2, 0]] :> 144, 
 HoldPattern[f[0, 2, 2]] :> 196, HoldPattern[f[1, 0, 0]] :> 121, 
 HoldPattern[f[1, 0, 2]] :> 169, HoldPattern[f[1, 1, 0]] :> 144, 
 HoldPattern[f[1, 1, 2]] :> 196, HoldPattern[f[1, 2, 0]] :> 169, 
 HoldPattern[f[1, 2, 2]] :> 225}

Furthermore: as a rule of thumb, I would say there's no difference between Set and SetDelayed[..., Evaluate[...]]. You can replace your function definition with f[x, y, z] = Module[{tmp1}, tmp1 = tmp; tmp1^2] and you'll get the exact same result.
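A minimal illustration of that rule of thumb: both forms evaluate the right-hand side at definition time, and the stored down-value is a delayed rule in both cases:

```
ClearAll[g, h];
g[1] = 1 + 1;             (* Set: rhs evaluates now *)
h[1] := Evaluate[1 + 1];  (* SetDelayed + Evaluate: rhs also evaluates now *)

DownValues[g]  (* {HoldPattern[g[1]] :> 2} *)
DownValues[h]  (* {HoldPattern[h[1]] :> 2} *)
```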

It would seem to me that this should be equivalent to the above, with the {x,-1,0} range:

Do[f[x, y, z] = (x + y + z + 10)^2,
  {x, -1, 0}, {y, 0, 2}, {z, 0, 2, 2}];

The numbers are the same, but ?f shows "=" instead of ":=". The DownValues are exactly the same. Why does ?f format the output differently in the two cases, given the very same underlying information?
