Message Boards

Why is ParallelMap Slower Than Map?


Consider the following code:

In[3]:= AbsoluteTiming[Map[Sqrt, 1.0*Range[10^7]];]

Out[3]= {0.323223, Null}

In[5]:= AbsoluteTiming[ParallelMap[Sqrt, 1.0*Range[10^7]];]

Out[5]= {36.9965, Null}
POSTED BY: Frank Kampas
5 months ago

Because Sqrt is a VERY fast operation. Sending the commands to the kernels, interpreting them, executing them, copying the data from 4 (6? 8?) different slices of memory, bringing the results back through the links to the master kernel, recopying them to memory, and then combining all of them takes time…
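A quick way to see the pure overhead is to parallel-map a do-nothing function, so essentially everything you measure is communication and bookkeeping (a sketch; the timings will of course vary with machine and kernel count):

AbsoluteTiming[Map[Identity, Range[10^6]];]
AbsoluteTiming[ParallelMap[Identity, Range[10^6]];]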

It can be improved by using a minimum number of 'slices':

AbsoluteTiming[ParallelMap[Sqrt, 1.0*Range[10^7], Method -> "CoarsestGrained"];]

In this case the so-called overhead is just much larger than the actual operation…
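As an aside, since Sqrt has the Listable attribute, the fastest option here is usually not to Map at all but to apply it to the packed array directly (a sketch; on a typical machine this vectorised form beats both Map and ParallelMap):

AbsoluteTiming[Sqrt[1.0*Range[10^7]];]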

POSTED BY: Sander Huisman
5 months ago

So it's only useful for very complicated functions.

In[12]:= f[x_] = Total[ x^-Range[100]];

In[13]:= AbsoluteTiming[Map[f, 1.0*Range[10^5]];]

Out[13]= {10.9114, Null}

In[14]:= AbsoluteTiming[ParallelMap[f, 1.0*Range[10^5]];]

Out[14]= {3.83556, Null}

In[15]:= AbsoluteTiming[
 ParallelMap[f, 1.0*Range[10^5], Method -> "CoarsestGrained"];]

Out[15]= {3.97303, Null}
POSTED BY: Frank Kampas
5 months ago

Not necessarily complicated functions; memory is also important. For 10^7 elements, the memory overhead and the general overhead of parallelisation are not in your favour ;)

POSTED BY: Sander Huisman
5 months ago

It doesn't work at all for ParametricIPOPTMinimize:

In[21]:= Needs["IPOPTLink`"]

In[22]:= f[x_, y_] := x Sin[10 x] y ^2 Cos[5 y]

In[23]:= ipts = RandomPoint[Disk[{-1, -1}], 1000];

In[24]:= pf = 
 ParametricIPOPTMinimize[
  f[x, y], {x, y}, {x0, 
   y0}, {{-2, 0}, {-2, 0}}, {(x + 1)^2 + (y + 1)^2}, {{0, 1}}, {x0, 
   y0}]

Out[24]= ParametricFunction[ <> ]

In[25]:= AbsoluteTiming[sols = Map[pf[Sequence @@ #] &, ipts];]

Out[25]= {4.55567, Null}

In[28]:= IPOPTMinValue[sols[[1]]]

Out[28]= -1.82848

In[30]:= sols[[1]]

Out[30]= IPOPTData[1]

In[27]:= AbsoluteTiming[
 solsp = ParallelMap[pf[Sequence @@ #] &, ipts];]

Out[27]= {0.330978, Null}

In[31]:= solsp[[1]]

Out[31]= ParametricFunction[2, 
  Internal`Bag[{Hold[
     ParametricIPOPTMinimize[x y^2 Cos[5 y] Sin[10 x], {x, y}, 
      ParametricIPOPTData[
       1, {x y^2 Cos[5 y] Sin[10 x], {(1 + x)^2 + (1 + y)^2}, {{0, 
          1}}, {x, y}, {{-2, 0}, {-2, 0}}, {x0, 
         y0}, {{1, 1}, {1, 2}}, {{1, 1}, {2, 1}, {2, 
          2}}, {"nlp_lower_bound_inf" -> -1.*10^19, 
         "nlp_upper_bound_inf" -> 1.*10^19}, Automatic, True, 
        IPOPTMinimize}, {x0, y0}]]]}, 1, 1], 0, 
  1, {{x0$6679, y0$6680}, 
   System`Utilities`HashTable[
    1, {{y0$6680, y0}, {x0$6679, x0}}], {}, {}, {}, 
   None}, {NDSolve`base$6681, 
   ParametricIPOPTData[
    1, {x y^2 Cos[5 y] Sin[10 x], {(1 + x)^2 + (1 + y)^2}, {{0, 
       1}}, {x, y}, {{-2, 0}, {-2, 0}}, {x0$6679, 
      y0$6680}, {{1, 1}, {1, 2}}, {{1, 1}, {2, 1}, {2, 
       2}}, {"nlp_lower_bound_inf" -> -1.*10^19, 
      "nlp_upper_bound_inf" -> 1.*10^19}, Automatic, True, 
     IPOPTMinimize}, {x0$6679, y0$6680}]}][-1.10076, -1.53568]
POSTED BY: Frank Kampas
5 months ago

I find your notation amusing…

Map[pf[Sequence @@ #] &, ipts]
Map[(pf @@ #)&, ipts]
pf @@@ ipts

Is pf a (pure) function? I'm not that familiar with ParametricIPOPTMinimize… You might have to distribute the definitions beforehand? I'm not sure. I'm sure someone else here has used this minimizer.
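If the problem is that the subkernels don't know about the package, something along these lines might help (a sketch, untested — and a ParametricFunction object may still not serialise to the subkernels correctly, which would explain the mangled output above):

ParallelNeeds["IPOPTLink`"]
DistributeDefinitions[pf];
solsp = ParallelMap[pf @@ # &, ipts];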

POSTED BY: Sander Huisman
5 months ago

I didn't use @@@ as I wanted the change from Map to ParallelMap to be as small as possible, which I think is useful when troubleshooting a problem.

POSTED BY: Frank Kampas
5 months ago
